Coefficient of multiple correlation

If all the predictor variables are uncorrelated, the matrix <math>R_{xx}</math> is the identity matrix and <math>R^2</math> simply equals <math>\mathbf{c}^\top\, \mathbf{c}</math>, the sum of the squared correlations with the dependent variable. If the predictor variables are correlated among themselves, the inverse of the correlation matrix <math>R_{xx}</math> accounts for this.
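The matrix formula above can be sketched numerically. The following is a minimal illustration with NumPy on synthetic data (the variable names and the simulated dataset are assumptions for the example, not part of the article): it builds the predictor correlation matrix <math>R_{xx}</math>, the vector <math>\mathbf{c}</math> of correlations between each predictor and the dependent variable, and evaluates <math>R^2 = \mathbf{c}^\top R_{xx}^{-1}\, \mathbf{c}</math>.

```python
import numpy as np

# Synthetic data: two deliberately correlated predictors and a response.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.5 * x1 + rng.normal(size=200)       # x2 is correlated with x1
y = 2.0 * x1 - x2 + rng.normal(size=200)

X = np.column_stack([x1, x2])

# R_xx: correlation matrix among the predictors.
R_xx = np.corrcoef(X, rowvar=False)

# c: correlations between each predictor and the dependent variable y.
c = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])

# Squared coefficient of multiple correlation: R^2 = c' R_xx^{-1} c.
# Solving the linear system avoids explicitly inverting R_xx.
R2 = c @ np.linalg.solve(R_xx, c)
print(R2)
```

Because the predictors here are correlated, dropping the <math>R_{xx}^{-1}</math> factor (i.e. just summing the squared correlations) would overstate <math>R^2</math>.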
 
The squared coefficient of multiple correlation can also be computed as the fraction of variance of the dependent variable that is explained by the independent variables, which in turn is 1 minus the unexplained fraction. The unexplained fraction can be computed as the [[sum of squares of residuals]]&mdash;that is, the sum of the squares of the prediction errors&mdash;divided by the [[Total sum of squares|sum of squares of deviations of the values of the dependent variable]] from its [[expected value]].
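The variance-fraction computation can be sketched as follows; as before, the data and variable names are assumptions for illustration. The sketch fits an ordinary least-squares regression with an intercept, then forms <math>R^2 = 1 - \mathrm{SS}_\text{res} / \mathrm{SS}_\text{tot}</math>.

```python
import numpy as np

# Synthetic data for illustration only.
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
y = 1.5 * x1 + 0.5 * x2 + rng.normal(size=200)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta                    # prediction errors

ss_res = np.sum(resid ** 2)             # sum of squares of residuals
ss_tot = np.sum((y - y.mean()) ** 2)    # total sum of squares
R2 = 1.0 - ss_res / ss_tot
print(R2)
```

With an intercept in the model, this value agrees with the correlation-matrix formula above, and it lies between 0 and 1.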
 
==Properties==