{{More footnotes|date=November 2010}}
In [[statistics]], the coefficient of '''multiple correlation''' is a measure of how well a given variable can be predicted using a linear function of a set of other variables. It is the correlation between the variable's values and the best predictions that can be computed linearly from the predictive variables; it takes values between 0 and 1, with higher values indicating better predictability of the dependent variable from the independent variables.
==Definition==
The coefficient of multiple determination ''R''<sup>2</sup> (a [[scalar (mathematics)|scalar]]) can be computed using the [[Euclidean space|vector]] ''c'' of cross-[[correlation]]s between the predictor variables and the criterion variable, its [[transpose]] ''c''&prime;, and the [[Matrix (mathematics)|matrix]] ''R''<sub>''xx''</sub> of inter-correlations between the predictor variables:

::''R''<sup>2</sup> = ''c''&prime; ''R''<sub>''xx''</sub><sup>−1</sup> ''c''.

The coefficient of multiple correlation is the positive square root of ''R''<sup>2</sup>.
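The identity can be checked numerically. The following is a minimal sketch, assuming simulated data and illustrative variable names (none of which come from the sources cited below): it builds ''c'' and ''R''<sub>''xx''</sub> from sample correlations, evaluates ''c''&prime;''R''<sub>''xx''</sub><sup>−1</sup>''c'', and compares the result with the coefficient of determination of an ordinary least-squares fit.

<syntaxhighlight lang="python">
# Sketch only: simulated data and illustrative names; assumes NumPy is available.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)            # predictors, correlated with each other
y = 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)  # criterion variable

# Vector c of predictor-criterion correlations and matrix R_xx of
# inter-correlations between the predictors.
c = np.array([np.corrcoef(x1, y)[0, 1], np.corrcoef(x2, y)[0, 1]])
R_xx = np.corrcoef(np.column_stack([x1, x2]), rowvar=False)

R2_from_correlations = c @ np.linalg.solve(R_xx, c)   # c' R_xx^{-1} c

# Coefficient of determination of an OLS fit of y on x1 and x2 (with intercept).
A = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
R2_from_ols = 1.0 - resid.var() / y.var()

# The multiple correlation itself is the correlation between y and its
# linear prediction, i.e. the positive square root of R^2.
R_multiple = np.corrcoef(y, A @ beta)[0, 1]

print(R2_from_correlations, R2_from_ols)          # agree up to rounding
print(np.sqrt(R2_from_correlations), R_multiple)  # agree up to rounding
</syntaxhighlight>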
==Properties==
Unlike the [[coefficient of determination]] in a regression involving just two variables, the coefficient of multiple determination is not computationally [[commutative]]: a regression of ''y'' on ''x'' and ''z'' will in general have a different R<sup>2</sup> than will a regression of ''z'' on ''x'' and ''y''. For example, suppose that in a particular sample the variable ''z'' is [[Correlation and dependence|uncorrelated]] with both ''x'' and ''y'', while ''x'' and ''y'' are linearly related to each other. Then a regression of ''z'' on ''y'' and ''x'' will yield an R<sup>2</sup> of zero, while a regression of ''y'' on ''x'' and ''z'' will yield a positive R<sup>2</sup>.
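As an illustration of this non-symmetry, the sketch below (whose simulated data and helper function are assumptions for illustration, not taken from the sources cited below) generates ''z'' independently of ''x'' and ''y'' while making ''y'' a noisy linear function of ''x'', and then compares the two regressions.

<syntaxhighlight lang="python">
# Sketch only: simulated data and illustrative names; assumes NumPy is available.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)   # y is linearly related to x
z = rng.normal(size=n)             # z is unrelated to both x and y

def r_squared(target, predictors):
    """R^2 of an ordinary least-squares fit of target on the predictors (with intercept)."""
    A = np.column_stack([np.ones(len(target))] + list(predictors))
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ beta
    return 1.0 - resid.var() / target.var()

print(r_squared(z, [x, y]))   # near zero: x and y do not predict z
print(r_squared(y, [x, z]))   # clearly positive: x predicts y
</syntaxhighlight>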
==References==
{{Reflist}}
* Allison, Paul D.
* Cohen, Jacob, et al. (2002) ''Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences''
* Crown, William H. (1998) ''Statistical Models for the Social and Behavioral Sciences: Multiple Regression and Limited-Dependent Variable Models''
* Edwards, Allen Louis
* Keith, Timothy Z. (2005) ''Multiple Regression and Beyond''
* Kerlinger, Fred N., and Pedhazur, Elazar J.
* Stanton, Jeffrey M. (2001) [http://www.amstat.org/publications/jse/v9n3/stanton.html "Galton, Pearson, and the Peas: A Brief History of Linear Regression for Statistics Instructors"], ''Journal of Statistics Education'', 9 (3)
{{DEFAULTSORT:Multiple Correlation}}