Consistent estimator

{{broader|Consistency (statistics)}}
 
[[Image:Consistency of estimator.svg|thumb|250px|{''T''<sub>1</sub>, ''T''<sub>2</sub>, ''T''<sub>3</sub>, …} is a sequence of estimators for parameter ''θ''<sub>0</sub>, the true value of which is 4. This sequence is consistent: the estimators are getting more and more concentrated near the true value ''θ''<sub>0</sub>; at the same time, these estimators are biased. The limiting distribution of the sequence is a degenerate random variable which equals ''θ''<sub>0</sub> with probability 1.]]
 
In [[statistics]], a '''consistent estimator''' or '''asymptotically consistent estimator''' is an [[estimator]]—a rule for computing estimates of a parameter ''θ''<sub>0</sub>—having the property that as the number of data points used increases indefinitely, the resulting sequence of estimates [[convergence in probability|converges in probability]] to ''θ''<sub>0</sub>. This means that the distributions of the estimates become more and more concentrated near the true value of the parameter being estimated, so that the probability of the estimator being arbitrarily close to ''θ''<sub>0</sub> converges to one.
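
Formally, this property can be stated as follows (a standard formulation; the symbol ''T''<sub>''n''</sub>, denoting the estimator computed from ''n'' observations, is introduced here for illustration): for every ''ε'' > 0,

:<math>\lim_{n\to\infty}\Pr\bigl(|T_n - \theta_0| > \varepsilon\bigr) = 0.</math>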