With more than two variables being related to each other, the value of the coefficient of multiple correlation depends on the choice of dependent variable: a regression of ''y'' on ''x'' and ''z'' will in general have a different R<sup>2</sup> than will a regression of ''z'' on ''x'' and ''y''. For example, suppose that in a particular sample the variable ''z'' is [[Correlation and dependence|uncorrelated]] with both ''x'' and ''y'', while ''x'' and ''y'' are linearly related to each other. Then a regression of ''z'' on ''y'' and ''x'' will yield an R<sup>2</sup> of zero, while a regression of ''y'' on ''x'' and ''z'' will yield a strictly positive R<sup>2</sup>. This follows since the correlation of ''y'' with the best predictor based on ''x'' and ''z'' is in all cases at least as large as the correlation of ''y'' with the best predictor based on ''x'' alone, and in this case, with ''z'' providing no explanatory power, it is exactly as large.
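This asymmetry can be illustrated numerically. The following sketch (not part of the article; the data-generating choices and the helper <code>r_squared</code> are illustrative assumptions) builds a sample in which ''x'' and ''y'' are nearly linearly related while ''z'' is generated independently of both, then compares the two regressions. In a finite random sample ''z'' is only approximately uncorrelated with ''x'' and ''y'', so its R<sup>2</sup> is near zero rather than exactly zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.1, size=n)  # y is (noisily) linear in x
z = rng.normal(size=n)                        # z generated independently of x and y

def r_squared(dep, *predictors):
    """Ordinary-least-squares R^2 of dep regressed on predictors plus an intercept."""
    X = np.column_stack([np.ones_like(dep)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, dep, rcond=None)
    resid = dep - X @ beta
    tss = (dep - dep.mean()) @ (dep - dep.mean())
    return 1.0 - (resid @ resid) / tss

# Regressing y on x and z: large R^2, since y is almost determined by x.
print(r_squared(y, x, z))
# Regressing z on y and x: R^2 near zero, since z carries no linear relation to them.
print(r_squared(z, y, x))
```

The two R<sup>2</sup> values differ sharply even though the same three variables are involved, matching the point above: R<sup>2</sup> is a property of a chosen regression, not a symmetric property of the variable set.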