Random sample consensus: Difference between revisions

'''return''' bestFit
 
== Example code ==
 
A Python implementation mirroring the pseudocode follows. It also defines a <code>LinearRegressor</code> based on least squares, applies <code>RANSAC</code> to a 2D regression problem, and visualizes the outcome:
<syntaxhighlight lang="python">
 
    def fit(self, X, y):

        for _ in range(self.k):
            ids = rng.permutation(X.shape[0])
    def predict(self, X):
        return self.best_fit.predict(X)
 
 
def square_error_loss(y_true, y_pred):
    return (y_true - y_pred) ** 2
 
 
def mean_square_error(y_true, y_pred):
    return np.sum(square_error_loss(y_true, y_pred)) / y_true.shape[0]
 
 
class LinearRegressor:
        X = np.hstack([np.ones((r, 1)), X])
        return X @ self.params
 
 
if __name__ == "__main__":
    plt.plot(line, regressor.predict(line), c="peru")
    plt.show()
 
</syntaxhighlight>
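For readers who want a compact, end-to-end version of the same iterative loop, a minimal sketch is shown below. It is not the article's exact code: the <code>ransac_line</code> helper, the parameter values, and the synthetic data are illustrative, and only NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_line(X, y, n=10, k=100, t=0.5, d=20):
    """Minimal RANSAC sketch for fitting a line y = a + b*x.

    n: points drawn to fit a candidate model, k: iterations,
    t: squared-error threshold for counting a point as an inlier,
    d: extra inliers required before a candidate is accepted.
    (All values here are illustrative, not prescribed by the article.)
    """
    best_params, best_error = None, np.inf
    A = np.column_stack([np.ones_like(X), X])  # design matrix with intercept
    for _ in range(k):
        ids = rng.permutation(len(X))
        maybe = ids[:n]
        # fit a candidate model on the random subset by least squares
        params, *_ = np.linalg.lstsq(A[maybe], y[maybe], rcond=None)
        resid = (y - A @ params) ** 2
        # points outside the subset whose residual is below the threshold
        also = ids[n:][resid[ids[n:]] < t]
        inliers = np.concatenate([maybe, also])
        if len(inliers) > n + d:
            # refit on the full consensus set and keep the best model seen
            better, *_ = np.linalg.lstsq(A[inliers], y[inliers], rcond=None)
            err = np.mean((y[inliers] - A[inliers] @ better) ** 2)
            if err < best_error:
                best_params, best_error = better, err
    return best_params

# noisy samples of y = 1 + 2x, with gross outliers mixed in
X = np.linspace(0, 10, 200)
y = 2 * X + 1 + rng.normal(0, 0.2, X.shape)
y[::20] += rng.normal(0, 30, y[::20].shape)
params = ransac_line(X, y)  # recovers roughly [1, 2] despite the outliers
```

Because each candidate model is scored only on its consensus set, the gross outliers never influence the accepted fit, which is the behaviour the figure below illustrates.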
[[File:RANSAC applied to 2D regression problem.png|alt=A scatterplot showing a diagonal line from the bottom left to top right of the figure. A trend line fits closely along the diagonal, without being thrown off by outliers scattered elsewhere in the figure.|center|thumb|Result of running the <code>RANSAC</code> implementation. The orange line shows the least squares parameters found by the iterative approach, which successfully ignores the outlier points.]]