Consistent estimator

=== Sample mean of a normal random variable ===
 
Suppose one has a sequence of observations {''X''<sub>1</sub>, ''X''<sub>2</sub>, ...} from a [[Normal distribution|normal ''N''(''μ'', ''σ''<sup>2</sup>)]] distribution. To estimate ''μ'' based on the first ''n'' observations, one can use the [[sample mean]]: ''T<sub>n</sub>''&nbsp;=&nbsp;(''X''<sub>1</sub> + ... + ''X<sub>n</sub>'')/''n''. This defines a sequence of estimators, indexed by the sample size ''n''.
 
From the properties of the normal distribution, we know the [[sampling distribution]] of this statistic: ''T''<sub>''n''</sub> is itself normally distributed, with mean ''μ'' and variance ''σ''<sup>2</sup>/''n''. Equivalently, <math style="vertical-align:-.3em">\scriptstyle (T_n-\mu)/(\sigma/\sqrt{n})</math> has a standard normal distribution:
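The concentration described above can be checked numerically. The following is a minimal Monte Carlo sketch (not part of the article; the values of ''μ'' and ''σ'' are arbitrary choices for illustration) showing that across many replications the sample mean has average close to ''μ'' and variance close to ''σ''<sup>2</sup>/''n'', shrinking as ''n'' grows:

```python
# Monte Carlo sketch: the sample mean T_n of N(mu, sigma^2) draws has
# mean mu and variance sigma^2/n, so it concentrates around mu as n grows.
import random
import statistics

random.seed(0)
mu, sigma = 5.0, 2.0  # arbitrary illustrative parameters

def sample_mean(n):
    """Sample mean T_n of n independent N(mu, sigma^2) draws."""
    return statistics.fmean(random.gauss(mu, sigma) for _ in range(n))

for n in (10, 100, 10000):
    # Empirical variance of T_n over many replications vs. sigma**2 / n.
    means = [sample_mean(n) for _ in range(2000)]
    print(n, statistics.fmean(means), statistics.pvariance(means), sigma**2 / n)
```

The printed empirical variance tracks ''σ''<sup>2</sup>/''n'' at each sample size, which is the concentration that underlies consistency of the sample mean.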
 
=== Biased but consistent ===
Alternatively, an estimator can be biased but consistent. For example, if the mean is estimated by <math>{1 \over n} \sum x_i + {1 \over n}</math>, the estimator is biased (its bias is exactly 1/''n''), but as <math>n \rightarrow \infty</math> the bias vanishes and the estimator converges in probability to the correct value, so it is consistent.
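A brief numerical sketch of this example (the true mean 3.0 is an arbitrary illustrative value, not from the article): the extra 1/''n'' term makes the estimator biased at every finite ''n'', yet the estimate still settles on the true mean as ''n'' grows.

```python
# Sketch: the estimator (1/n) * sum(x_i) + 1/n has bias exactly 1/n,
# which shrinks to zero, so it is biased but consistent.
import random

random.seed(1)
mu = 3.0  # arbitrary true mean for illustration

def biased_mean(xs):
    n = len(xs)
    return sum(xs) / n + 1.0 / n  # sample mean plus a 1/n bias term

for n in (10, 1000, 100000):
    xs = [random.gauss(mu, 1.0) for _ in range(n)]
    print(n, biased_mean(xs))  # estimates approach mu as n grows
```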
 
Important examples include the [[sample variance]] and [[sample standard deviation]]. Without [[Bessel's correction]] (that is, when using the sample size ''n'' instead of the [[Degrees of freedom (statistics)|degrees of freedom]] ''n''&nbsp;−&nbsp;1), these are both negatively biased but consistent estimators. With the correction, the corrected sample variance is unbiased, while the corrected sample standard deviation is still biased, but less so, and both are still consistent: the correction factor converges to 1 as sample size grows.
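The variance example can be sketched the same way (true variance 4.0 chosen arbitrarily for illustration): the ''n''-denominator estimator is biased downward, the Bessel-corrected one is unbiased, and both converge to the true variance because the correction factor ''n''/(''n''&nbsp;−&nbsp;1) tends to 1.

```python
# Sketch: both the n-denominator (biased) and (n-1)-denominator
# (Bessel-corrected) variance estimators are consistent; the correction
# factor n/(n-1) converges to 1 as n grows.
import random

random.seed(2)
sigma2 = 4.0  # arbitrary true variance for illustration

def variances(xs):
    """Return (biased, Bessel-corrected) sample variance estimates."""
    n = len(xs)
    m = sum(xs) / n
    ss = sum((x - m) ** 2 for x in xs)
    return ss / n, ss / (n - 1)

for n in (5, 50, 5000):
    xs = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    print(n, variances(xs), n / (n - 1))  # both estimates near sigma2 for large n
```

At small ''n'' the two estimates differ noticeably; at ''n''&nbsp;=&nbsp;5000 they are nearly identical, reflecting the correction factor's convergence to 1.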