Formation matrix
{{short description|Concept in statistical inference}}
{{technical|date=May 2014}}
 
In [[statistics]] and [[information theory]], the '''expected formation matrix''' and the '''observed formation matrix''' are concepts used to quantify the uncertainty associated with parameter estimates derived from a [[likelihood function]] <math>L(\theta)</math>. They are the matrix inverses of the [[Fisher information matrix]] and the [[observed information matrix]] of <math>L(\theta)</math>, respectively.<ref>Edwards (1984) p104</ref>
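As a minimal numerical sketch of the observed case (illustrative only, not from the article: the exponential model, sample size, and step size are assumptions), the observed information is the negative second derivative of the log-likelihood at the maximum likelihood estimate, and the observed formation matrix is its inverse:

```python
import numpy as np

# Illustrative sample from an assumed exponential(rate) model.
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=500)
rate_hat = 1.0 / x.mean()  # MLE of the rate parameter

def loglik(rate):
    # Exponential log-likelihood: n*log(rate) - rate * sum(x)
    return x.size * np.log(rate) - rate * x.sum()

# Observed information: negative second derivative of the log-likelihood
# at the MLE, here approximated by a central finite difference.
h = 1e-4
d2 = (loglik(rate_hat + h) - 2 * loglik(rate_hat) + loglik(rate_hat - h)) / h**2
observed_info = -d2                       # analytically n / rate_hat**2

# The observed formation "matrix" (a scalar in this one-parameter model)
# is the inverse of the observed information.
observed_formation = 1.0 / observed_info
```

In the one-parameter case the matrix inverse reduces to a reciprocal; for a vector parameter the same construction uses the Hessian of the log-likelihood and `numpy.linalg.inv`.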
 
Because Fisher information measures the amount of information that an observable [[random variable]] carries about an unknown [[parameter]] <math>\theta</math>, its inverse represents a measure of the dispersion or variance for an [[estimator]] of <math>\theta</math>. The formation matrix is therefore related to the [[covariance matrix]] of an estimator and is central to the [[Cramér–Rao bound]], which establishes a lower bound on the variance of unbiased estimators. These matrices appear naturally in the [[asymptotic expansion]] of the distribution of many statistics related to the [[Likelihood-ratio test|likelihood ratio]].
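A short sketch of the expected case (the normal model, sample size, and parameter values are illustrative assumptions, not from the article): for n i.i.d. draws from <math>N(\mu, \sigma^2)</math>, the expected Fisher information matrix has a standard closed form, and inverting it yields the expected formation matrix, whose diagonal gives the Cramér–Rao lower bounds:

```python
import numpy as np

# Expected Fisher information for n i.i.d. draws from N(mu, sigma^2),
# parameterized by theta = (mu, sigma^2):
#   I(theta) = [[n/sigma^2, 0], [0, n/(2*sigma^4)]]
n, sigma2 = 100, 4.0
fisher = np.array([[n / sigma2, 0.0],
                   [0.0, n / (2 * sigma2**2)]])

# The expected formation matrix is the matrix inverse.
formation = np.linalg.inv(fisher)

# Its diagonal entries are the Cramer-Rao lower bounds:
#   Var(mu_hat)     >= sigma^2 / n       = 0.04
#   Var(sigma2_hat) >= 2 * sigma^4 / n   = 0.32
print(formation[0, 0], formation[1, 1])
```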
Currently, no single notation for dealing with formation matrices is universally used. In works by [[Ole E. Barndorff-Nielsen]] and [[Peter McCullagh]], the symbol <math>j^{ij}</math> denotes the element in the i-th row and j-th column of the observed formation matrix. An alternative notation, <math>g^{ij}</math>, arises from the [[Information Geometry|geometric interpretation]] of the Fisher information matrix as a [[metric tensor]]. The Fisher information metric is denoted by <math>g_{ij}</math>. Following [[Einstein notation]], these are related by <math> g_{ik}g^{kj} = \delta_i^j</math>.
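The contraction <math>g_{ik}g^{kj} = \delta_i^j</math> is simply the statement that <math>g^{ij}</math> is the matrix inverse of <math>g_{ij}</math>; a quick numerical check (the 2×2 metric values are a hypothetical example) confirms the product is the identity (Kronecker delta):

```python
import numpy as np

# g_ij: a hypothetical 2x2 Fisher information metric; g^ij is its inverse.
g_lower = np.array([[2.0, 0.5],
                    [0.5, 1.0]])
g_upper = np.linalg.inv(g_lower)

# The Einstein-notation contraction g_ik g^kj is a matrix product,
# which must equal the identity matrix (Kronecker delta).
delta = g_lower @ g_upper
print(np.allclose(delta, np.eye(2)))  # True
```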
 
 
== References ==
*Barndorff-Nielsen, O.E., Cox, D.R. (1989). ''Asymptotic Techniques for Use in Statistics''. Chapman and Hall, London. {{ISBN|0-412-31400-2}}
*Barndorff-Nielsen, O.E., Cox, D.R. (1994). ''Inference and Asymptotics''. Chapman & Hall, London.
*McCullagh, P. (1987). ''Tensor Methods in Statistics''. Monographs on Statistics and Applied Probability. Chapman and Hall.
*Edwards, A.W.F. (1984). ''Likelihood''. Cambridge University Press. {{ISBN|0-521-31871-8}}
 
[[Category:Estimation theory]]
[[Category:Information theory]]
[[Category:Statistical inference]]

{{statistics-stub}}