'''Metropolis algorithm (symmetric proposal distribution)'''
Let <math>f(x)</math> be a function that is proportional to the desired probability density function <math>P(x)</math> (also known as the ''target distribution''){{efn|In the original paper by Metropolis et al. (1953), <math>f</math> was taken to be the [[Boltzmann distribution]], as the specific application considered was [[Monte Carlo integration]] of [[equation of state|equations of state]] in [[physical chemistry]]; the extension by Hastings generalized to an arbitrary distribution <math>f</math>.}}.
# Initialization: Choose an arbitrary point <math>x_t</math> to be the first observation in the sample and choose an arbitrary probability density <math>g(x\mid y)</math> (sometimes written <math>Q(x\mid y)</math>) that suggests a candidate for the next sample value <math>x</math>, given the previous sample value <math>y</math>. In this section, <math>g</math> is assumed to be symmetric; in other words, it must satisfy <math>g(x\mid y) = g(y\mid x)</math>. A usual choice is to let <math>g(x\mid y)</math> be a [[Gaussian distribution]] centered at <math>y</math>, so that points closer to <math>y</math> are more likely to be visited next, making the sequence of samples into a [[random walk]]{{efn|In the original paper by Metropolis et al. (1953), <math>g(x\mid y)</math> was suggested to be a random translation with uniform density over some prescribed range.}}. The function <math>g</math> is referred to as the ''proposal density'' or ''jumping distribution''.
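The procedure above, with the usual Gaussian (random-walk) proposal, can be illustrated by a minimal sketch. The following Python code is only an illustration of random-walk Metropolis for a one-dimensional target; the names <code>metropolis_sample</code> and <code>step_size</code>, and the use of NumPy, are assumptions of the example rather than part of the algorithm's definition.

<syntaxhighlight lang="python">
import numpy as np

def metropolis_sample(f, x0, n_samples, step_size=1.0, rng=None):
    """Illustrative random-walk Metropolis sampler: draws samples whose
    distribution is proportional to f, using a symmetric Gaussian proposal."""
    rng = np.random.default_rng() if rng is None else rng
    samples = np.empty(n_samples)
    x = x0
    fx = f(x)
    for i in range(n_samples):
        # Propose a candidate from a Gaussian centered at the current state;
        # this proposal is symmetric, so g(x'|x) = g(x|x').
        x_new = x + step_size * rng.standard_normal()
        f_new = f(x_new)
        # Accept with probability min(1, f(x_new)/f(x)); the symmetric
        # proposal densities cancel in the acceptance ratio.
        if rng.random() < f_new / fx:
            x, fx = x_new, f_new
        samples[i] = x
    return samples

# Example: sample from a standard normal target known only up to a constant.
unnormalized_normal = lambda x: np.exp(-0.5 * x * x)
draws = metropolis_sample(unnormalized_normal, x0=0.0, n_samples=10_000, step_size=2.5)
</syntaxhighlight>

In such a sketch, the proposal width (here <code>step_size</code>) governs the random walk: very small steps explore the target slowly, while very large steps are rejected often, so in practice it is tuned to balance the two.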