{{Short description|Algorithm for the line of best fit for a two-dimensional dataset}}
[[Image:Total least squares.svg|thumb|Deming regression. The red lines show the error in both ''x'' and ''y''. This is different from the traditional least squares method, which measures error parallel to the ''y'' axis. The case shown, with deviations measured perpendicularly, arises when errors in ''x'' and ''y'' have equal variances.]]
 
In [[statistics]], '''Deming regression''', named after [[W. Edwards Deming]], is an [[errors-in-variables model]] that tries to find the [[line of best fit]] for a two-dimensional dataset. It differs from [[simple linear regression]] in that it accounts for [[errors and residuals in statistics|errors]] in observations on both the ''x''- and the ''y''-axis. It is a special case of [[total least squares]], which allows for any number of predictors and a more complicated error structure.
 
Deming regression is equivalent to the [[maximum likelihood]] estimation of an [[errors-in-variables model]] in which the errors for the two variables are assumed to be independent and [[normal distribution|normally distributed]], and the ratio of their variances, denoted ''δ'', is known.{{sfn|Linnet|1993}} In practice, this ratio might be estimated from related data sources; however, the regression procedure takes no account of possible errors in estimating this ratio.
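
Under these assumptions the maximum-likelihood slope has a closed form in the sample variances and covariance of the observed ''x'' and ''y''. The following is a minimal illustrative sketch in Python, not a standard library routine; the function name <code>deming_fit</code> is chosen here only for illustration, and ''δ'' is taken as the ratio of the ''y''-error variance to the ''x''-error variance.

<syntaxhighlight lang="python">
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Illustrative Deming regression fit (name chosen for this sketch).

    delta is the assumed-known ratio of the y-error variance to the
    x-error variance; delta = 1 corresponds to orthogonal regression.
    Returns the estimated (slope, intercept).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    x_bar, y_bar = x.mean(), y.mean()

    # Sample variances and covariance of the observed data
    s_xx = np.sum((x - x_bar) ** 2) / (n - 1)
    s_yy = np.sum((y - y_bar) ** 2) / (n - 1)
    s_xy = np.sum((x - x_bar) * (y - y_bar)) / (n - 1)

    # Closed-form maximum-likelihood slope for a known variance ratio delta
    slope = (s_yy - delta * s_xx
             + np.sqrt((s_yy - delta * s_xx) ** 2 + 4.0 * delta * s_xy ** 2)
             ) / (2.0 * s_xy)
    intercept = y_bar - slope * x_bar
    return slope, intercept
</syntaxhighlight>

Setting <code>delta = 1</code> gives the orthogonal-regression case with perpendicular deviations shown in the figure above.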