In [[econometrics]], the '''information matrix test''' is used to determine whether a [[regression model]] is [[Statistical model specification|misspecified]]. The test was developed by [[Halbert White]],<ref>{{Cite journal |last1=White|first1=Halbert|title=Maximum Likelihood Estimation of Misspecified Models |journal=[[Econometrica]] |date=1982 |volume=50 |issue=1 |pages=1–25 |doi=10.2307/1912526 |jstor=1912526 }}</ref> who observed that in a correctly specified model and under standard regularity assumptions, the [[Fisher information matrix]] can be expressed in either of two ways: as the [[outer product]] of the [[gradient]] of the log-likelihood function, or as a function of the [[Hessian matrix]] of the log-likelihood function. Under correct specification the two expressions coincide, so a significant discrepancy between their sample analogues is evidence of misspecification.
Consider a linear model <math>\mathbf{y} = \mathbf{X} \mathbf{\beta} + \mathbf{u}</math>, where the errors <math>\mathbf{u}</math> are assumed to be distributed <math>\mathrm{N}(0, \sigma^2 \mathbf{I})</math>. If the parameters <math>\beta</math> and <math>\sigma^2</math> are stacked in the vector <math>\mathbf{\theta}^{\mathsf{T}} = \begin{bmatrix} \beta & \sigma^2 \end{bmatrix}</math>, the resulting [[Likelihood function|log-likelihood function]] is
:<math>\ell(\beta, \sigma^2) = -\frac{n}{2} \log 2 \pi \sigma^2 - \frac{1}{2 \sigma^2} \left( \mathbf{y} - \mathbf{X} \mathbf{\beta} \right)^{\mathsf{T}} \left( \mathbf{y} - \mathbf{X} \mathbf{\beta} \right)</math>
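The information matrix equality underlying the test can be illustrated numerically. The following sketch (not part of the original article; variable names such as <code>opg</code> and <code>neg_hess</code> are ad hoc) simulates a correctly specified linear model, evaluates the per-observation scores and Hessian of the log-likelihood above at the maximum likelihood estimates, and shows that the outer-product-of-gradient and negative-average-Hessian estimates of the information matrix are approximately equal, as White's result predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
beta_true = np.array([1.0, 2.0])
sigma2 = 1.0

# Simulate a correctly specified model y = X beta + u, u ~ N(0, sigma2 I)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Maximum likelihood estimates: OLS for beta, residual variance with divisor n
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta_hat
s2 = u @ u / n

# Per-observation score vectors for theta = (beta, sigma^2)
g_beta = X * (u / s2)[:, None]                    # d ell_i / d beta
g_s2 = -1.0 / (2 * s2) + u**2 / (2 * s2**2)       # d ell_i / d sigma^2
scores = np.column_stack([g_beta, g_s2])          # n x (k+1)

# Outer-product-of-gradient estimate of the information matrix
opg = scores.T @ scores / n

# Negative average Hessian estimate (analytic second derivatives)
k = X.shape[1]
H = np.zeros((k + 1, k + 1))
H[:k, :k] = -(X.T @ X) / s2
H[:k, k] = H[k, :k] = -(X.T @ u) / s2**2
H[k, k] = n / (2 * s2**2) - (u @ u) / s2**3
neg_hess = -H / n

# Under correct specification the two estimates should be close
print(np.round(opg, 2))
print(np.round(neg_hess, 2))
```

Under misspecification (for example, heteroskedastic errors), the two matrices diverge, and the information matrix test measures this divergence.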