==Computation==
The square of the coefficient of multiple correlation can be computed using the [[Euclidean space|vector]] <math>\mathbf{c} = {(r_{x_1 y}, r_{x_2 y},\dots,r_{x_N y})}^\top</math> of [[correlation]]s <math>r_{x_n y}</math> between the predictor variables <math>x_n</math> (independent variables) and the target variable <math>y</math> (dependent variable), and the [[correlation matrix]] <math>R_{xx}</math> of inter-correlations between predictor variables. It is given by
::<math>R^2 = \mathbf{c}^\top R_{xx}^{-1}\, \mathbf{c}</math>,
where <math>\mathbf{c}^\top</math> is the [[transpose]] of <math>\mathbf{c}</math>, and <math>R_{xx}^{-1}</math> is the [[Matrix inversion|inverse]] of the matrix
::<math>R_{xx} = \left(\begin{array}{cccc}
r_{x_1 x_1} & r_{x_1 x_2} & \dots & r_{x_1 x_N} \\
r_{x_2 x_1} & \ddots & & \vdots \\
\vdots & & \ddots & \\
r_{x_N x_1} & \dots & & r_{x_N x_N}
\end{array}\right)</math>.
If all the predictor variables are uncorrelated, the matrix <math>R_{xx}</math> is the identity matrix and <math>R^2</math> simply equals <math>\mathbf{c}^\top\, \mathbf{c}</math>, the sum of the squared correlations with the dependent variable. If the predictor variables are correlated among themselves, the inverse of the correlation matrix <math>R_{xx}</math> accounts for this.
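The formula above can be sketched numerically in Python with NumPy; the data here are synthetic and purely illustrative, and solving the linear system <math>R_{xx}\mathbf{b} = \mathbf{c}</math> is used in place of an explicit matrix inverse:

```python
import numpy as np

# Illustrative synthetic data: three predictor variables and one target
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=100)

# Vector c of correlations between each predictor x_n and the target y
c = np.array([np.corrcoef(X[:, n], y)[0, 1] for n in range(X.shape[1])])

# Correlation matrix R_xx of inter-correlations between the predictors
R_xx = np.corrcoef(X, rowvar=False)

# R^2 = c^T R_xx^{-1} c, computed via a linear solve rather than inversion
R2 = c @ np.linalg.solve(R_xx, c)
```

The same value is obtained as the coefficient of determination of an ordinary least-squares regression of <math>y</math> on the predictors with an intercept term, which gives a simple consistency check.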