Simple linear regression

{{Regression bar}}
 
In [[statistics]], '''simple linear regression''' ('''SLR''') is a [[linear regression]] model with a single [[covariate|explanatory variable]].<ref>{{cite book |last=Seltman |first=Howard J. |date=2008-09-08 |title=Experimental Design and Analysis |url=http://www.stat.cmu.edu/~hseltman/309/Book/Book.pdf |page=227}}</ref><ref name=":0">{{cite web |url=http://ci.columbia.edu/ci/premba_test/c0331/s7/s7_6.html |title=Statistical Sampling and Regression: Simple Linear Regression |publisher=Columbia University |access-date=2016-10-17 |quote=When one independent variable is used in a regression, it is called a simple regression;(...)}}</ref><ref>{{cite book |last=Lane |first=David M. |title=Introduction to Statistics |url=http://onlinestatbook.com/Online_Statistics_Education.pdf |page=462}}</ref><ref>{{Cite journal |last1=Zou |first1=KH |last2=Tuncali |first2=K |last3=Silverman |first3=SG |date=2003 |title=Correlation and simple linear regression |journal=Radiology |language=English |volume=227 |issue=3 |pages=617–22 |issn=0033-8419 |oclc=110941167 |doi=10.1148/radiol.2273011499 |pmid=12773666 |url=https://repositorio.unal.edu.co/handle/unal/81200}}</ref><ref>{{Cite journal |last1=Altman |first1=Naomi |last2=Krzywinski |first2=Martin |date=2015 |title=Simple linear regression |journal=Nature Methods |language=English |volume=12 |issue=11 |pages=999–1000 |issn=1548-7091 |oclc=5912005539 |doi=10.1038/nmeth.3627 |pmid=26824102 |s2cid=261269711 |doi-access=free}}</ref> That is, it concerns two-dimensional sample points with [[dependent and independent variables|one independent variable and one dependent variable]] (conventionally, the ''x'' and ''y'' coordinates in a [[Cartesian coordinate system]]) and finds a linear function (a non-vertical [[straight line]]) that, as accurately as possible, predicts the dependent variable values as a function of the independent variable.
The adjective ''simple'' refers to the fact that the outcome variable is related to a single predictor.
 
It is common to make the additional stipulation that the [[ordinary least squares]] (OLS) method should be used: the accuracy of each predicted value is measured by its squared ''[[errors and residuals|residual]]'' (the vertical distance between the data point and the fitted line), and the goal is to make the sum of these squared deviations as small as possible.
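As a minimal sketch of the OLS criterion (using invented sample values and assuming NumPy; this is an illustration, not a prescribed implementation), the standard closed-form estimates below minimize the sum of squared residuals, and perturbing either parameter can only increase it:

<syntaxhighlight lang="python">
import numpy as np

# Invented sample points, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

def rss(alpha, beta):
    """Sum of squared residuals of the line y = alpha + beta * x."""
    return np.sum((y - (alpha + beta * x)) ** 2)

# Standard closed-form OLS estimates of slope and intercept.
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = y.mean() - beta_hat * x.mean()

# Any other line has a strictly larger residual sum of squares.
assert rss(alpha_hat, beta_hat) < rss(alpha_hat, beta_hat + 0.1)
assert rss(alpha_hat, beta_hat) < rss(alpha_hat + 0.1, beta_hat)
</syntaxhighlight>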
 
The remainder of the article assumes an ordinary least squares regression.
In this case, the slope of the fitted line is equal to the [[Pearson correlation coefficient|correlation]] between {{mvar|y}} and {{mvar|x}} corrected by the ratio of standard deviations of these variables. The intercept of the fitted line is such that the line passes through the center of mass {{math|({{overline|''x''}}, {{overline|''y''}})}} of the data points.
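In code, with the same invented data as above (a sketch under the stated NumPy assumption), this relationship reads:

<syntaxhighlight lang="python">
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

r = np.corrcoef(x, y)[0, 1]                           # Pearson correlation of x and y
beta_hat = r * np.std(y, ddof=1) / np.std(x, ddof=1)  # slope = r * (s_y / s_x)
alpha_hat = y.mean() - beta_hat * x.mean()            # intercept set from the centre of mass

# The fitted line passes through (x-bar, y-bar).
assert np.isclose(alpha_hat + beta_hat * x.mean(), y.mean())
</syntaxhighlight>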
 
==Alternatives==
[[File:Fitting a straight line to a data with outliers.png|thumb|Calculating the parameters of a linear model by minimizing the squared error can lead to a model that fits the outliers at the expense of the rest of the data.]]
 
In SLR, there is an underlying assumption that only the dependent variable contains measurement error; if the explanatory variable is also measured with error, then simple regression is not appropriate for estimating the underlying relationship because it will be biased due to [[regression dilution]].
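A small simulation sketch (with invented parameters, assuming NumPy) illustrates the dilution: when the observed explanatory variable carries additive measurement error, the OLS slope shrinks toward zero, roughly by the factor var(''x'') / (var(''x'') + var(error)):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_x = rng.normal(0.0, 1.0, n)               # latent explanatory variable, variance 1
y = 2.0 * true_x + rng.normal(0.0, 0.5, n)     # true slope is 2
observed_x = true_x + rng.normal(0.0, 1.0, n)  # measurement error, variance 1

def ols_slope(x, y):
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

print(ols_slope(true_x, y))      # close to the true slope 2
print(ols_slope(observed_x, y))  # attenuated: close to 2 * 1 / (1 + 1) = 1
</syntaxhighlight>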
 
Other regression methods that can be used in place of ordinary least squares include [[least absolute deviations]] (minimizing the sum of absolute values of residuals) and the [[Theil–Sen estimator]] (which chooses a line whose [[slope]] is the [[median]] of the slopes determined by pairs of sample points). [[Deming regression]] (total least squares) also finds a line that fits a set of two-dimensional sample points, but (unlike ordinary least squares, least absolute deviations, and median slope regression) it is not really an instance of simple linear regression, because it does not separate the coordinates into one dependent and one independent variable and could potentially return a vertical line as its fit.
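A minimal sketch of the Theil–Sen idea follows (invented data; the median-of-intercepts step is one common variant, not the only one):

<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations

# Invented data with one gross outlier at x = 5.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 30.0, 12.1])

# Slope: the median of the slopes over all pairs of sample points.
slopes = [(y[j] - y[i]) / (x[j] - x[i])
          for i, j in combinations(range(len(x)), 2)
          if x[i] != x[j]]
beta_hat = np.median(slopes)

# One common intercept choice: the median of the per-point intercepts.
alpha_hat = np.median(y - beta_hat * x)
print(beta_hat, alpha_hat)  # the outlier barely moves the fit
</syntaxhighlight>

For real use, <code>scipy.stats.theilslopes</code> provides an implementation of this estimator.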
 
=== Line fitting ===