{{Unreferenced|date=December 2009}}
In [[statistics]] the '''mean squared prediction error''' of a [[smoothing]] or [[curve fitting]] procedure is the expected sum of squared deviations of the fitted values <math>\widehat{g}</math> from the (unobservable) function <math>g</math>. If the smoothing procedure has operator matrix <math>L</math>, then
:<math>\operatorname{MSPE}(L)=\operatorname{E}\left[\sum_{i=1}^n\left( g(x_i)-\widehat{g}(x_i)\right)^2\right].</math>
:<math>\operatorname{MSPE}(L)=\sum_{i=1}^n\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2+\sum_{i=1}^n\operatorname{var}\left[\widehat{g}(x_i)\right].</math>
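This decomposition follows term by term from the elementary identity <math>\operatorname{E}\left[(Z-c)^2\right]=\left(\operatorname{E}[Z]-c\right)^2+\operatorname{var}[Z]</math>, applied with <math>Z=\widehat{g}(x_i)</math> and <math>c=g(x_i)</math> for each <math>i</math>:
:<math>\operatorname{E}\left[\left(g(x_i)-\widehat{g}(x_i)\right)^2\right]=\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2+\operatorname{var}\left[\widehat{g}(x_i)\right].</math>
Summing over <math>i</math> gives the squared-bias and variance terms above.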
Note that knowledge of the unobservable function <math>g</math> is required to compute the MSPE exactly; in practice the MSPE must therefore be estimated.
==Estimation of MSPE==
:<math>\operatorname{\widehat{MSPE}}(L)=\sum_{i=1}^n\left(y_i-\widehat{g}(x_i)\right)^2-\widehat{\sigma}^2\left(n-2\operatorname{tr}\left[L\right]\right).</math>
[[Colin Mallows]] advocated this method in the construction of his model selection statistic [[Mallows's Cp|''C<sub>p</sub>'']]:
:<math>C_p=\frac{\sum_{i=1}^n\left(y_i-\widehat{g}(x_i)\right)^2}{\widehat{\sigma}^2}-n+2\operatorname{tr}\left[L\right].</math>
where <math>\widehat{\sigma}^2</math> is an estimate of the error variance <math>\sigma^2</math> and <math>\operatorname{tr}\left[L\right]</math> denotes the trace of the operator matrix <math>L</math>.
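A minimal numerical sketch of these plug-in estimates, assuming NumPy; the simulated data, the choice of a simple two-parameter linear smoother, and the use of the true error variance as <math>\widehat{\sigma}^2</math> are illustrative assumptions, not part of the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: the (normally unobservable) truth g(x) = sin(x) plus noise.
n = 50
x = np.linspace(0.0, 3.0, n)
sigma2 = 0.04  # true error variance, known here only by construction
y = np.sin(x) + rng.normal(scale=np.sqrt(sigma2), size=n)

# Linear smoother: ordinary least squares on (1, x), so ghat = L y with
# operator ("hat") matrix L = X (X'X)^{-1} X'; tr[L] equals the number
# of fitted parameters (2 here).
X = np.column_stack([np.ones(n), x])
L = X @ np.linalg.solve(X.T @ X, X.T)
ghat = L @ y

rss = np.sum((y - ghat) ** 2)  # residual sum of squares
trL = np.trace(L)

# Plug-in estimates from the formulas above, taking sigma-hat^2 = sigma2
# (in practice sigma-hat^2 would come from a low-bias reference model).
mspe_hat = rss - sigma2 * (n - 2 * trL)
cp = rss / sigma2 - n + 2 * trL

print(trL, mspe_hat, cp)
```

Note that with <math>\widehat{\sigma}^2</math> fixed, the two quantities carry the same information: <math>C_p</math> is just <math>\operatorname{\widehat{MSPE}}(L)/\widehat{\sigma}^2</math>.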
==See also==