}}
In [[statistics]] the '''mean squared prediction error''' ('''MSPE''') of a smoothing, curve fitting, or regression procedure is the expected value of the squared differences between the fitted values <math>\widehat{g}</math> and the values of the (unobservable) true function ''g''.
If the smoothing or fitting procedure has [[projection matrix]] (i.e., hat matrix) ''L'', which maps the observed values vector <math>y</math> to the [[predicted value]]s vector <math>\widehat{y}=Ly</math>, then the MSPE is
:<math>\operatorname{MSPE}(L)=\frac{1}{n}\sum_{i=1}^n\operatorname{E}\left[\left( g(x_i)-\widehat{g}(x_i)\right)^2\right].</math>
The MSPE can be decomposed into two terms: the sum of squared biases of the fitted values and the sum of variances of the fitted values:
:<math>\operatorname{SSPE}(L)=n\cdot\operatorname{MSPE}(L)=\sum_{i=1}^n\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2+\sum_{i=1}^n\operatorname{var}\left[\widehat{g}(x_i)\right].</math>
The quantity {{math|SSPE{{=}}''n''MSPE}} is called '''sum squared prediction error''' ('''SSPE''').
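The decomposition above can be checked numerically. A minimal sketch (the smoother, the test function <math>g(x)=\sin x</math>, and all parameter values are illustrative choices, not part of the definition) using a cubic-polynomial least-squares fit as the linear smoother:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, reps = 50, 0.3, 20000
x = np.linspace(0, np.pi, n)
g = np.sin(x)                                   # true (normally unknown) function

# Hat matrix of a cubic least-squares fit: L = X (X'X)^{-1} X'
X = np.vander(x, 4)
L = X @ np.linalg.solve(X.T @ X, X.T)

# Monte Carlo: fitted values ghat = L y for many noisy samples y
Y = g + sigma * rng.standard_normal((reps, n))
Ghat = Y @ L.T                                  # each row is L y

sspe = np.mean(np.sum((Ghat - g) ** 2, axis=1))     # estimate of SSPE = n * MSPE
bias2 = np.sum((Ghat.mean(axis=0) - g) ** 2)        # sum of squared biases
var = np.sum(Ghat.var(axis=0))                      # sum of variances
# sspe equals bias2 + var (the decomposition holds term by term in i)
```

With the empirical mean and variance taken over the same samples, the identity holds exactly up to floating-point error, not just in expectation.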
The '''root mean squared prediction error''' ('''RMSPE''') is the square root of MSPE: {{math|RMSPE{{=}}{{sqrt|MSPE}}}}.
Knowledge of ''g'' is required in order to calculate the MSPE exactly; otherwise, it can be estimated.
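When ''g'' is unknown, SSPE can be estimated from the residual sum of squares of a linear smoother, Mallows's ''C<sub>p</sub>''-style, provided the noise variance <math>\sigma^2</math> is known: since <math>\operatorname{E}[\mathrm{RSS}] = \sum_i \operatorname{bias}_i^2 + \sigma^2\left(n - 2\operatorname{tr}L + \operatorname{tr}LL'\right)</math> and <math>\operatorname{SSPE} = \sum_i \operatorname{bias}_i^2 + \sigma^2\operatorname{tr}LL'</math>, an unbiased estimate is <math>\mathrm{RSS} - \sigma^2(n - 2\operatorname{tr}L)</math>. A sketch under the same illustrative choices as above (cubic-polynomial smoother, known <math>\sigma</math>):

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 200, 0.3
x = np.linspace(0, np.pi, n)
g = np.sin(x)                       # true function; not used by the estimator
y = g + sigma * rng.standard_normal(n)

# Linear smoother: hat matrix of a cubic polynomial fit (illustrative choice)
X = np.vander(x, 4)
L = X @ np.linalg.solve(X.T @ X, X.T)
ghat = L @ y

# Estimate SSPE without knowing g: RSS corrected by sigma^2 * (n - 2 tr L)
rss = np.sum((y - ghat) ** 2)
sspe_hat = rss - sigma**2 * (n - 2 * np.trace(L))
mspe_hat = sspe_hat / n
```

For a projection matrix ''L'' of rank ''p'' (here ''p'' = 4), <math>\operatorname{tr}L = \operatorname{tr}LL' = p</math>, so the correction reduces to <math>\mathrm{RSS} - \sigma^2(n - 2p)</math>.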