[[Image:Metropolis_hastings_algorithm.png|thumb|350px|The sampling [[probability distribution|distribution]] of the Metropolis-Hastings algorithm.]]
In [[mathematics]] and [[physics]],
the '''Metropolis-Hastings algorithm''' is an [[algorithm]] to generate a sequence of samples from the [[joint distribution]] of two or more variables.
The purpose of such a sequence is to approximate the joint distribution (as with a histogram), or to compute an integral (such as an expected value).
This algorithm is an example of a [[Markov chain Monte Carlo]] algorithm. It is a generalization, due to Hastings (1970), of the original Metropolis algorithm (see the references below).
The [[Gibbs sampling]] algorithm is a special case of the Metropolis-Hastings algorithm.
The Metropolis-Hastings algorithm can draw samples from any [[probability distribution]] <math>P(x)</math>, requiring only that a function proportional to the density can be calculated at <math>x</math>.
The algorithm generates a set of states <math>x^t</math> that form a [[Markov chain]], because each new state depends only on the state before it. The algorithm depends on the creation of a ''proposal density'' <math>Q(x'; x^t)</math>, which depends on the current state <math>x^t</math> and which can generate a new proposed sample <math>x'</math>. For example, the proposal density could be a [[Gaussian function]] centred on the current state <math>x^t</math>:

:<math>
Q(x'; x^t) \sim N(x^t, \sigma^2 I)
</math>

reading <math>Q(x'; x^t)</math> as the probability density of proposing <math>x'</math> given the current state <math>x^t</math>. This proposal density would generate samples centred around the current state with variance <math>\sigma^2 I</math>.
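A minimal sketch of such a Gaussian proposal in Python is given below; the function name <code>propose</code>, the step size <code>sigma</code>, and the use of [[NumPy]] are illustrative assumptions rather than part of the algorithm itself.

<pre>
import numpy as np

rng = np.random.default_rng(0)

def propose(x_t, sigma=1.0):
    # Draw a proposal x' from Q(x'; x^t) = N(x^t, sigma^2 I),
    # i.e. a Gaussian centred on the current state x^t.
    x_t = np.atleast_1d(np.asarray(x_t, dtype=float))
    return x_t + sigma * rng.standard_normal(x_t.shape)
</pre>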
So we draw a new proposal state <math>x'</math> with probability <math>Q(x'; x^t)</math> and then calculate a value

:<math>
a = a_1 a_2
</math>

where

:<math>
a_1 = \frac{P(x')}{P(x^t)}
</math>

is the likelihood ratio between the proposed sample <math>x'</math> and the previous sample <math>x^t</math>, and

:<math>
a_2 = \frac{Q(x^t; x')}{Q(x'; x^t)}
</math>

is the ratio of the proposal density in two directions (from <math>x^t</math> to <math>x'</math> and vice versa).
This is equal to 1 if the proposal density is symmetric. The new state <math>x^{t+1}</math> is then chosen according to the following rule:

:<math>
x^{t+1}=\left\{\begin{matrix} x' & \mbox{if }a \ge 1 \\ x'\mbox{ with probability }a, & \mbox{if }a < 1 \end{matrix}\right.
</math>

If the proposed state is rejected (which happens with probability <math>1-a</math> when <math>a < 1</math>), the chain stays at the current state, so <math>x^{t+1} = x^t</math>.
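Putting the proposal density, the ratio <math>a = a_1 a_2</math>, and the acceptance rule together, a minimal Python sketch of the whole sampler follows. The symmetric Gaussian proposal (for which <math>a_2 = 1</math>), the function names, and the use of an unnormalized target density are assumptions made for illustration.

<pre>
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(p, x0, n_samples, sigma=1.0):
    # p:  function proportional to the target density P(x)
    # x0: initial state of the chain
    # sigma: proposal standard deviation (illustrative default)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    samples = []
    for _ in range(n_samples):
        x_prop = x + sigma * rng.standard_normal(x.shape)  # draw x' from Q(x'; x^t)
        a = p(x_prop) / p(x)                               # a = a_1 * a_2, with a_2 = 1 here
        if a >= 1 or rng.random() < a:                     # accept x' with probability min(1, a)
            x = x_prop
        samples.append(x.copy())                           # on rejection the chain stays at x^t
    return np.array(samples)

# Example: sampling from an (unnormalized) standard normal density.
chain = metropolis_hastings(lambda x: np.exp(-0.5 * np.sum(x**2)), x0=[0.0], n_samples=5000)
</pre>

With a symmetric proposal, as here, the factor <math>a_2</math> equals 1 and the method reduces to the original Metropolis algorithm; for an asymmetric proposal, <math>a_2</math> would multiply the ratio computed above.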
The Markov chain is started from a random initial value <math>x^0</math>, and the algorithm is run for many iterations until this initial state is "forgotten". These early samples, which are discarded, are known as the ''burn-in'' period; the remaining accepted values of <math>x</math> represent a sample from the distribution <math>P(x)</math>.

The algorithm works best if the proposal density matches the shape of the target distribution, that is <math>Q(x'; x^t) \approx P(x')</math>. If a Gaussian proposal density is used, the variance parameter <math>\sigma^2</math> has to be tuned during the burn-in period. This is usually done by calculating the ''acceptance rate'', which is the fraction of proposed samples that is accepted in a window of the last <math>N</math> samples. This is usually set to be around 60%. If the proposal steps are too small, the chain will ''mix slowly'' (i.e., it will move around the space slowly and converge slowly to <math>P(x)</math>). If the proposal steps are too large, the acceptance rate will be very low because the proposals are likely to land in regions of much lower probability density, so <math>a_1</math> will be very small and the chain will again converge very slowly.
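One possible way to tune the proposal width from the acceptance rate during burn-in is sketched below; the multiplicative update factors and the window size are assumptions for illustration, since the text only states that the acceptance rate over the last <math>N</math> samples is monitored and kept near 60%.

<pre>
import numpy as np

def tune_sigma(accepted, sigma, target_rate=0.6, window=100):
    # accepted: list of booleans, one per recent proposal (True if accepted)
    # sigma:    current proposal standard deviation
    recent = accepted[-window:]
    if len(recent) < window:
        return sigma                 # too few samples to estimate the acceptance rate
    rate = np.mean(recent)
    if rate > target_rate:
        return sigma * 1.1           # steps too small: widen the proposal
    return sigma * 0.9               # steps too large: narrow the proposal
</pre>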
== References ==
* Chib, Siddhartha and Edward Greenberg. "Understanding the Metropolis–Hastings Algorithm". ''American Statistician'', 49:327–335, 1995.
* W.K. Hastings. "Monte Carlo Sampling Methods Using Markov Chains and Their Applications". ''Biometrika'', 57:97–109, 1970.
* N. Metropolis, A.W. Rosenbluth, M.N. Rosenbluth, A.H. Teller, and E. Teller. "Equation of State Calculations by Fast Computing Machines". ''Journal of Chemical Physics'', 21:1087–1092, 1953.
[[Category:Algorithms]]
[[Category:Probability and statistics]]
[[Category:Physics]]
[[Category:Sampling techniques]]