The adjective ''simple'' refers to the fact that the outcome variable is related to a single predictor.
It is common to make the additional stipulation that the [[ordinary least squares]] (OLS) method should be used: the accuracy of each predicted value is measured by its squared ''[[errors and residuals|residual]]'' (the vertical distance between the data point and the fitted line), and the goal is to make the sum of these squared deviations as small as possible. There is an underlying assumption that only the dependent variable contains measurement error; if the explanatory variable is also measured with error, then [[errors-in-variables models]] (also known as measurement error models) are required instead.
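For concreteness, a minimal worked form of this criterion (the notation <math>y_i \approx \alpha + \beta x_i</math> for the fitted line is assumed here rather than taken from this excerpt): minimizing the sum of squared residuals over the intercept and slope has the standard closed-form solution

<math display="block">
\min_{\alpha,\beta}\ \sum_{i=1}^{n}\bigl(y_i-\alpha-\beta x_i\bigr)^2,
\qquad
\hat\beta=\frac{\sum_{i=1}^{n}(x_i-\bar x)(y_i-\bar y)}{\sum_{i=1}^{n}(x_i-\bar x)^2},
\qquad
\hat\alpha=\bar y-\hat\beta\,\bar x,
</math>

where <math>\bar x</math> and <math>\bar y</math> are the sample means.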
Other regression methods that can be used in place of ordinary least squares include [[least absolute deviations]] (minimizing the sum of absolute values of residuals) and the [[Theil–Sen estimator]] (which chooses a line whose [[slope]] is the [[median]] of the slopes determined by pairs of sample points). [[Deming regression]] (total least squares) also finds a line that fits a set of two-dimensional sample points, but (unlike ordinary least squares, least absolute deviations, and median slope regression) it is not really an instance of simple linear regression, because it does not separate the coordinates into one dependent and one independent variable and could potentially return a vertical line as its fit.
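As an illustration of the median-slope idea described above, here is a minimal sketch of the Theil–Sen estimate in Python (the function name <code>theil_sen</code> and the intercept rule used, the median of <math>y_i - \hat\beta x_i</math>, are choices made for this example, not prescribed by the text):

<syntaxhighlight lang="python">
# Sketch of the Theil–Sen estimator: the slope is the median of the slopes
# of the lines through all pairs of sample points; the intercept shown here
# uses one common convention (the median of y_i - slope * x_i).
from itertools import combinations
from statistics import median

def theil_sen(x, y):
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in combinations(zip(x, y), 2)
              if x2 != x1]                      # skip pairs with equal x (vertical lines)
    slope = median(slopes)
    intercept = median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Points roughly on y = 2x + 1, with one gross outlier at x = 5.
x = [0, 1, 2, 3, 4, 5]
y = [1.1, 2.9, 5.2, 7.0, 9.1, 30.0]
print(theil_sen(x, y))  # the slope stays near 2 despite the outlier
</syntaxhighlight>

Unlike the least-squares line, this fit is largely unaffected by the single outlier, which is the usual motivation for the median-of-slopes construction.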