Coefficient of multiple correlation

 
==Fundamental equation of multiple regression analysis==
The coefficient of multiple determination ''R''<sup>2</sup> (a [[scalar (mathematics)|scalar]]) can be computed using the [[Euclidean space|vector]] ''c'' of [[correlation]]s between the predictor variables and the criterion variable, its [[transpose]]&nbsp;''c''', and the [[Matrix (mathematics)|matrix]] ''R''<sub>''xx''</sub> of inter-correlations between predictor variables. The "fundamental equation of multiple regression analysis"<ref>Visualstatistics.net [http://www.visualstatistics.net/Visual%20Statistics%20Multimedia/multiple_regression_analysis.htm]</ref> is
 
::''R''<sup>2</sup> = ''c''' ''R''<sub>''xx''</sub><sup>&minus;1</sup> ''c''.
The expression on the left side denotes the coefficient of multiple determination (the squared coefficient of multiple correlation). The terms on the right side are the transposed vector ''c''' of cross-correlations, the [[Matrix inversion|inverse]] of the matrix ''R''<sub>''xx''</sub> of inter-correlations, and the vector ''c'' of cross-correlations. The coefficient of multiple correlation ''R'' is the positive square root of the resulting coefficient of multiple determination. If all the predictor variables are uncorrelated, the matrix ''R''<sub>''xx''</sub> is the identity matrix and ''R''<sup>2</sup> simply equals ''c''' ''c'', the sum of the squared cross-correlations. Otherwise, the inverted matrix of inter-correlations corrects for the redundant variance that results from the inter-correlations of the predictor variables.
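The fundamental equation can be checked numerically. The sketch below (illustrative only, using synthetic data and NumPy) builds the cross-correlation vector ''c'' and the inter-correlation matrix ''R''<sub>''xx''</sub> from two correlated predictors, evaluates ''R''<sup>2</sup> = ''c''' ''R''<sub>''xx''</sub><sup>&minus;1</sup> ''c'', and compares the result with the ''R''<sup>2</sup> obtained from an ordinary least-squares fit:

```python
import numpy as np

# Synthetic data: two inter-correlated predictors and a criterion (illustrative only).
rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # x2 is correlated with x1
y = 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2])

# Vector c of cross-correlations between each predictor and the criterion.
c = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])

# Matrix R_xx of inter-correlations between the predictors.
R_xx = np.corrcoef(X, rowvar=False)

# Fundamental equation: R^2 = c' R_xx^{-1} c
# (solve is used instead of an explicit inverse for numerical stability).
R2 = c @ np.linalg.solve(R_xx, c)

# Cross-check: R^2 from an ordinary least-squares fit with an intercept.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
R2_ols = 1.0 - resid.var() / y.var()

print(R2, R2_ols)  # the two values agree to numerical precision
```

Because ''x''<sub>1</sub> and ''x''<sub>2</sub> are correlated here, ''c''' ''c'' alone would overstate the explained variance; the ''R''<sub>''xx''</sub><sup>&minus;1</sup> factor removes that redundancy.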