Simple linear regression: Difference between revisions

Inference: Removed the Formula check statement
== Estimating the regression line ==
 
The parameters of the linear regression model, <math> Y_i = a + bX_i + \varepsilon_i </math>, can be estimated using the method of [[ordinary least squares]]. This method finds the line that minimizes the sum of the squared errors, <math> \sum_{i = 1}^n \hat{\varepsilon}_{i}^2 </math>. The residual is the difference between the observed value and the predicted value: <math> \hat{\varepsilon}_{i} = y_{i} - \hat{y}_{i} </math>
 
The minimization problem can be solved using calculus, producing the following formulas for the estimates of the regression parameters:
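The article itself contains no code, but the standard closed-form least-squares estimates, <math> \hat{b} = \sum (x_i - \bar{x})(y_i - \bar{y}) / \sum (x_i - \bar{x})^2 </math> and <math> \hat{a} = \bar{y} - \hat{b}\bar{x} </math>, can be sketched directly in Python. The data values below are invented purely for illustration:

```python
# Minimal illustration of ordinary least squares for y = a + b*x.
# The data points here are made up for the example.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Slope: b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
    / sum((x - x_bar) ** 2 for x in xs)
# Intercept: a = y_bar - b * x_bar
a = y_bar - b * x_bar

# Residuals: observed value minus predicted value.
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
print(a, b)
```

The slope is the sample covariance of ''x'' and ''y'' divided by the sample variance of ''x''; the intercept then forces the line through the point of means.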
 
Ordinary least squares produces estimates with the following properties:
# The line goes through the point <math> (\bar{x},\bar{y}) </math>.
# The sum of the residuals is equal to zero.
# The sum of the residuals weighted by the corresponding ''x''-values is equal to zero.
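These three properties can be checked numerically on any least-squares fit. A small Python sketch, with invented data values, verifies each one:

```python
# Verify the three OLS properties on a toy data set (values invented).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 2.0, 6.0]

n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
    / sum((x - x_bar) ** 2 for x in xs)
a = y_bar - b * x_bar

res = [y - (a + b * x) for x, y in zip(xs, ys)]

# 1. The fitted line passes through (x_bar, y_bar).
assert abs((a + b * x_bar) - y_bar) < 1e-9
# 2. The residuals sum to zero.
assert abs(sum(res)) < 1e-9
# 3. The x-weighted sum of residuals is zero.
assert abs(sum(x * e for x, e in zip(xs, res))) < 1e-9
print("all three properties hold")
```

Properties 2 and 3 hold exactly (up to floating-point error) because they are the first-order conditions of the minimization; property 1 follows from the intercept formula.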