In [[econometrics]], the '''information matrix test''' is used to determine whether a [[regression model]] is [[Specification (regression)|misspecified]]. The test was developed by [[Halbert White]],<ref>{{Cite journal |last1=White|first1=Halbert|title=Maximum Likelihood Estimation of Misspecified Models |journal=[[Econometrica]] |date=1982 |volume=50 |issue=1 |pages=1–25 |jstor=1912526 }}</ref> who observed that in a correctly specified model and under standard regularity assumptions, the [[Fisher information|information matrix]] can be expressed in either of two ways: as the [[outer product]] of the [[gradient]], or as a function of the [[Hessian matrix]] of the log-likelihood function.
Consider a linear model <math>\mathbf{y} = \mathbf{X} \mathbf{\beta} + \mathbf{u}</math>, where the errors <math>\mathbf{u}</math> are assumed to be distributed <math>\mathrm{N} \left( 0, \sigma^2 \mathbf{I}_n \right)</math>. If the parameters <math>\mathbf{\beta}</math> and <math>\sigma^2</math> are stacked in the vector <math>\mathbf{\theta}^{\mathsf{T}} = \begin{bmatrix} \mathbf{\beta} & \sigma^2 \end{bmatrix}</math>, then the log-likelihood function is
:<math>\ell (\mathbf{\theta}) = - \frac{n}{2} \log \left( 2 \pi \sigma^2 \right) - \frac{1}{2 \sigma^2} \left( \mathbf{y} - \mathbf{X} \mathbf{\beta} \right)^{\mathsf{T}} \left( \mathbf{y} - \mathbf{X} \mathbf{\beta} \right)</math>
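The log-likelihood above is simply the sum of normal log-densities of the residuals, which makes it easy to check numerically. A minimal sketch in Python (the simulated data, dimensions, and parameter values below are illustrative assumptions, not part of the article):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # design matrix with intercept
beta = np.array([0.5, -1.0])                           # illustrative coefficients
sigma2 = 2.0                                           # illustrative error variance
y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)

def loglik(beta, sigma2, y, X):
    """l(theta) = -n/2 log(2 pi sigma^2) - (y - Xb)'(y - Xb) / (2 sigma^2)."""
    r = y - X @ beta
    return -0.5 * len(y) * np.log(2 * np.pi * sigma2) - (r @ r) / (2 * sigma2)

# agrees with summing the per-observation normal log-densities
direct = norm.logpdf(y, loc=X @ beta, scale=np.sqrt(sigma2)).sum()
print(np.isclose(loglik(beta, sigma2, y, X), direct))  # prints True
```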
The information matrix can then be expressed as
:<math>\mathbf{I} \left( \mathbf{\theta} \right) = \mathbb{E} \left[ \left( \frac{\partial \ell (\mathbf{\theta}) }{ \partial \mathbf{\theta} } \right) \left( \frac{\partial \ell (\mathbf{\theta}) }{ \partial \mathbf{\theta} } \right)^{\mathsf{T}} \right]</math>
that is, the expected value of the outer product of the gradient, or [[Score (statistics)|score]]. Second, it can be written as the negative of the expected value of the Hessian matrix of the log-likelihood function:
:<math>\mathbf{I} \left( \mathbf{\theta} \right) = - \mathbb{E} \left[ \frac{\partial^2 \ell (\mathbf{\theta}) }{ \partial \mathbf{\theta} \, \partial \mathbf{\theta}^{\mathsf{T}} } \right]</math>
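That the two expressions coincide for a correctly specified model can be checked by simulation: evaluated at the true parameters, the averaged outer product of the per-observation scores and the averaged negative Hessian converge to the same matrix. A minimal sketch for the linear–normal model (the data-generating values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100_000, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta, sigma2 = np.array([1.0, 2.0]), 1.5   # true parameters (illustrative)
u = rng.normal(scale=np.sqrt(sigma2), size=n)
y = X @ beta + u

resid = y - X @ beta
# per-observation score wrt (beta, sigma^2): [x_i u_i / s2, -1/(2 s2) + u_i^2 / (2 s2^2)]
scores = np.hstack([X * (resid / sigma2)[:, None],
                    (-0.5 / sigma2 + resid**2 / (2 * sigma2**2))[:, None]])
opg = scores.T @ scores / n                # outer-product-of-gradient form

# averaged analytic Hessian of the per-observation log-likelihoods
H = np.zeros((k + 1, k + 1))
H[:k, :k] = -(X.T @ X) / (n * sigma2)
H[:k, k] = H[k, :k] = -(X.T @ resid) / (n * sigma2**2)
H[k, k] = np.mean(0.5 / sigma2**2 - resid**2 / sigma2**3)
neg_hess = -H

# both are consistent estimates of the per-observation information matrix,
# so their difference is only sampling noise of order n^(-1/2)
print(np.max(np.abs(opg - neg_hess)))
```

Under misspecification (for example, heteroskedastic errors), the two estimates converge to different limits, which is the discrepancy the test exploits.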
If the model is correctly specified, both expressions should be equal. Combining the equivalent forms yields
:<math>\mathbf{\Delta}(\mathbf{\theta}) = \sum_{i=1}^n \left[ \frac{\partial^2 \ell(\mathbf{\theta}) }{ \partial \mathbf{\theta} \, \partial \mathbf{\theta}^{\mathsf{T}} } + \frac{\partial \ell(\mathbf{\theta}) }{ \partial \mathbf{\theta} } \left( \frac{\partial \ell (\mathbf{\theta}) }{ \partial \mathbf{\theta} } \right)^{\mathsf{T}} \right]</math>
where <math>\mathbf{\Delta} \left( \mathbf{\theta} \right)</math> is an <math>(r \times r)</math> [[random matrix]], where <math>r</math> is the number of parameters. White showed that the elements of <math>n^{-1/2} \mathbf{\Delta} ( \mathbf{\hat{\theta}} )</math>, where <math>\mathbf{\hat{\theta}}</math> is the [[maximum likelihood estimator]], are asymptotically normally distributed with means of zero when the model is correctly specified.

== References ==
{{reflist}}