Important examples include the [[sample variance]] and [[sample standard deviation]]. Without [[Bessel's correction]] (using the sample size ''n'' instead of the [[Degrees of freedom (statistics)|degrees of freedom]] ''n'' − 1), these are both negatively biased but consistent estimators. With the correction, the corrected sample variance is unbiased, while the corrected sample standard deviation is still biased, but less so, and both are still consistent: the correction factor converges to 1 as sample size grows.
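For example, for an independent and identically distributed sample <math>X_1, \dots, X_n</math> with sample mean <math>\overline{X}</math> and finite population variance <math>\sigma^2</math>, the uncorrected sample variance has expectation
:<math>E\!\left[\frac{1}{n}\sum_{i=1}^n \left(X_i - \overline{X}\right)^2\right] = \frac{n-1}{n}\,\sigma^2,</math>
so its bias is <math>-\sigma^2/n</math>, which vanishes as <math>n \to \infty</math> even though the estimator is biased for every finite <math>n</math>; multiplying by the correction factor <math>n/(n-1)</math> removes the bias entirely.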
Here is another example. Let <math>T_n</math> be a sequence of estimators for <math>\theta</math> such that
:<math>P(T_n) = \begin{cases}
1 - 1/n, & \mbox{if } T_n = \theta \\
1/n, & \mbox{if } T_n = \theta + n\delta
\end{cases}</math>
We can see that <math>T_n \xrightarrow{p} \theta</math> (so <math>T_n</math> is consistent), yet <math>E[T_n] = \theta + \delta</math> for every <math>n</math>, so the bias does not converge to zero.
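Indeed, computing the expectation directly from the two-point distribution above gives
:<math>E[T_n] = \left(1 - \frac{1}{n}\right)\theta + \frac{1}{n}\left(\theta + n\delta\right) = \theta + \delta,</math>
so the bias equals <math>\delta</math> for every <math>n</math>, even though <math>T_n</math> equals <math>\theta</math> with probability tending to one.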