Metropolis–Hastings algorithm: Difference between revisions

History: this OR is clearly incorrect; see e.g. G. O. Roberts and A. F. M. Smith, "Simple conditions for the convergence of the Gibbs sampler and Metropolis–Hastings algorithms", Stochastic Processes and their Applications, 1994.
[[Image:3dRosenbrock.png|thumb|The result of three [[Markov chain]]s running on the 3D [[Rosenbrock function]] using the Metropolis–Hastings algorithm. The algorithm samples from regions where the [[posterior probability]] is high, and the chains begin to mix in these regions. The approximate position of the maximum has been illuminated. The red points are the ones that remain after the burn-in process. The earlier ones have been discarded.]]
 
== Bayesian Inference ==
{{main article|Bayesian Inference}}
[[File:Flowchart-of-Metropolis-Hastings-M-H-algorithm-for-the-parameter-estimation-using-the.png|thumb|Flowchart of the Metropolis–Hastings (M–H) algorithm for parameter estimation using the Markov chain Monte Carlo (MCMC) approach.]]
MCMC can be used to draw samples from the [[posterior probability|posterior distribution]] of a statistical model's parameters, from which point estimates and uncertainties can be computed. At each step, a proposed parameter value <math>\theta^*</math> is accepted or rejected according to an acceptance probability.
The acceptance probability is given by:
<math>P_{acc}(\theta_i \to \theta^*)=\min\left(1, \frac{\mathcal{L}(y|\theta^*)P(\theta^*)}{\mathcal{L}(y|\theta_i)P(\theta_i)}\frac{Q(\theta_i|\theta^*)}{Q(\theta^*|\theta_i)}\right),</math>
where <math>\mathcal{L}</math> is the [[likelihood]], <math>P(\theta)</math> the prior probability density, and <math>Q</math> the (conditional) proposal density.
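As an illustrative sketch (not part of the article's sources), the acceptance rule above can be implemented in Python for a toy model: a Normal likelihood with known noise, a Normal prior on the mean <math>\mu</math>, and a symmetric random-walk proposal, for which the ratio <math>Q(\theta_i|\theta^*)/Q(\theta^*|\theta_i)</math> equals 1. The model, data, and all parameter values here are assumptions chosen for the example.

```python
import math
import random

# Toy model (assumed for illustration): data y_i ~ Normal(mu, 1),
# prior mu ~ Normal(0, 10^2), symmetric Gaussian random-walk proposal.

def log_likelihood(mu, data, sigma=1.0):
    # Log of L(y | mu), dropping constants that cancel in the ratio.
    return sum(-0.5 * ((y - mu) / sigma) ** 2 for y in data)

def log_prior(mu, scale=10.0):
    # Log of P(mu), again up to an additive constant.
    return -0.5 * (mu / scale) ** 2

def metropolis_hastings(data, n_steps=5000, step=0.5, seed=0):
    rng = random.Random(seed)
    mu = 0.0
    log_post = log_likelihood(mu, data) + log_prior(mu)
    samples = []
    for _ in range(n_steps):
        proposal = mu + rng.gauss(0.0, step)  # symmetric: Q-ratio = 1
        log_post_new = log_likelihood(proposal, data) + log_prior(proposal)
        # Accept with probability min(1, ratio), computed in log space
        # to avoid numerical underflow.
        if math.log(rng.random() + 1e-300) < log_post_new - log_post:
            mu, log_post = proposal, log_post_new
        samples.append(mu)
    return samples

data = [2.1, 1.9, 2.3, 2.0, 1.8]
samples = metropolis_hastings(data)
burned = samples[2500:]  # discard the first half as burn-in
posterior_mean = sum(burned) / len(burned)
```

With a weak prior, the posterior mean estimate lands near the sample mean of the data; the burn-in discard mirrors the treatment of early samples described in the figure caption above.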
 
==See also==