Mean squared prediction error

In [[statistics]], the '''mean squared prediction error''' ('''MSPE'''), also known as the '''mean squared error of the predictions''', of a [[smoothing]], [[curve fitting]], or [[regression (statistics)|regression]] procedure is the expected value of the [[squared deviation|squared difference]] between the fitted values implied by the predictive function <math>\widehat{g}</math> and the values of the (unobservable) [[true value|true]] function ''g''. It is an inverse measure of the [[explanatory power]] of <math>\widehat{g},</math> and can be used in the process of [[cross-validation (statistics)|cross-validation]] of an estimated model.
 
If the smoothing or fitting procedure has [[projection matrix]] (i.e., hat matrix) ''L'', which maps the vector of observed values <math>y</math> to the vector of [[predicted value]]s <math>\hat{y}=Ly,</math> then
 
:<math>\operatorname{MSPE}(L)=\frac{1}{n}\sum_{i=1}^n\operatorname{E}\left[\left( g(x_i)-\widehat{g}(x_i)\right)^2\right].</math>
 
The MSPE can be decomposed into two terms: the mean of the squared [[bias (statistics)|bias]]es of the fitted values and the mean of the [[variance]]s of the fitted values:
 
:<math>\operatorname{SSPE}(L)=n\cdot\operatorname{MSPE}(L)=\sum_{i=1}^n\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2+\sum_{i=1}^n\operatorname{var}\left[\widehat{g}(x_i)\right].</math>
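The decomposition follows from adding and subtracting <math>\operatorname{E}\left[\widehat{g}(x_i)\right]</math> inside each squared term; the cross term has expectation zero, so that

:<math>\operatorname{E}\left[\left( g(x_i)-\widehat{g}(x_i)\right)^2\right]=\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2+\operatorname{var}\left[\widehat{g}(x_i)\right],</math>

and summing over <math>i</math> gives the identity above.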
 
The quantity {{math|SSPE{{=}}''n''MSPE}} is called '''sum squared prediction error''' ('''SSPE''').
The '''root mean squared prediction error''' ('''RMSPE''') is the square root of MSPE: {{math|RMSPE{{=}}{{sqrt|MSPE}}}}.
 
Knowledge of ''g'' is required in order to calculate the MSPE exactly; otherwise, it can be estimated.
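As an illustration (a minimal sketch, not part of the formal definition; the simulated function, smoother, and parameters below are chosen arbitrarily for the example), the following Python code approximates the MSPE of a linear smoother by Monte Carlo simulation on data generated from a known ''g'', and checks the result against the bias-variance decomposition above:

<syntaxhighlight lang="python">
# Monte Carlo approximation of MSPE(L) for a linear smoother y_hat = L @ y,
# on simulated data where the (normally unobservable) true function g is known.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # "True" function, chosen arbitrarily for the simulation.
    return np.sin(2 * np.pi * x)

n = 50
x = np.linspace(0.0, 1.0, n)
sigma = 0.3                      # standard deviation of the observation noise

# Hat matrix L of a cubic polynomial least-squares fit: y_hat = L @ y.
X = np.vander(x, N=4, increasing=True)
L = X @ np.linalg.solve(X.T @ X, X.T)

# Monte Carlo estimate of MSPE(L) = (1/n) * sum_i E[(g(x_i) - g_hat(x_i))^2].
reps = 2000
sq_err = np.zeros(n)
for _ in range(reps):
    y = g(x) + rng.normal(scale=sigma, size=n)   # observed values
    y_hat = L @ y                                # fitted values g_hat(x_i)
    sq_err += (g(x) - y_hat) ** 2
mspe = sq_err.sum() / (reps * n)

# Exact value via the decomposition, using the linear-smoother identities
# E[g_hat] = L g and var[g_hat(x_i)] = sigma^2 * (L L^T)_{ii}.
bias_sq = ((L @ g(x) - g(x)) ** 2).sum()
var_sum = sigma ** 2 * np.trace(L @ L.T)
print("Monte Carlo MSPE:          ", mspe)
print("(bias^2 sum + var sum) / n:", (bias_sq + var_sum) / n)
</syntaxhighlight>

In practice ''g'' is unknown, so the MSPE would instead be estimated, for example from held-out data or by cross-validation.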