Residual sum of squares

{{Mergeto|Least squares|date=July 2006}}
 
In [[statistics]], the '''residual sum of squares (RSS)''' is the [[sum]] of squares of [[errors and residuals in statistics|residuals]]:
 
:<math>RSS = \sum_{i=1}^n (y_i - f(x_i))^2. </math>
In a standard [[regression model]] <math>y_i = a+bx_i+\varepsilon_i\,</math>, ''a'' and ''b'' are [[coefficient]]s, ''y'' and ''x'' are the [[regressand]] and the [[regressor]], respectively, and &epsilon;<sub>''i''</sub> is the error term. The residual sum of squares is the sum of squares of the [[estimator|estimates]] of &epsilon;<sub>''i''</sub>, that is
 
:<math>RSS = \sum_{i=1}^n (y_i - (a+bx_i))^2. </math>
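The formula above can be sketched in code. The following is a minimal illustration (not part of the article's source): it estimates ''a'' and ''b'' by the closed-form ordinary least squares formulas and then sums the squared residuals; the function name <code>rss</code> is chosen here for illustration.

```python
def rss(x, y):
    """Residual sum of squares for a simple linear fit y = a + b*x,
    with a and b estimated by ordinary least squares."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Closed-form OLS slope and intercept
    b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
    a = mean_y - b * mean_x
    # Sum of squared residuals y_i - (a + b*x_i)
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
```

For data lying exactly on a line, the residuals are all zero and the RSS is zero; otherwise it is strictly positive.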
 
In general: [[total sum of squares]] = [[explained sum of squares]] + '''residual sum of squares'''.
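The decomposition above can be checked numerically. The sketch below (illustrative only, with a hypothetical helper name <code>decompose</code>) fits a line by ordinary least squares with an intercept, for which the identity TSS = ESS + RSS holds, and returns the three sums.

```python
def decompose(x, y):
    """Return (TSS, ESS, RSS) for an OLS fit y = a + b*x with intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Closed-form OLS slope and intercept
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    fitted = [a + b * xi for xi in x]
    tss = sum((yi - my) ** 2 for yi in y)           # total sum of squares
    ess = sum((fi - my) ** 2 for fi in fitted)      # explained sum of squares
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
    return tss, ess, rss
```

Note that the identity relies on the model including an intercept term; without one, the cross term in the decomposition does not vanish in general.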