For the former purpose (that of approximating a posterior probability), variational Bayes is an alternative to [[Monte Carlo sampling]] methods, particularly [[Markov chain Monte Carlo]] methods such as [[Gibbs sampling]], for taking a fully Bayesian approach to [[statistical inference]] over complex [[probability distribution|distributions]] that are difficult to evaluate directly or [[sample (statistics)|sample]]. In particular, whereas Monte Carlo techniques provide a numerical approximation to the exact posterior using a set of samples, variational Bayes provides a locally optimal, exact analytical solution to an approximation of the posterior.
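This contrast can be sketched with the standard variational objective (the notation here, <math>\mathbf{X}</math> for the data, <math>\mathbf{Z}</math> for the unobserved variables, and <math>q</math> for the approximating distribution, is illustrative rather than taken from this section): variational Bayes chooses <math>q</math> from a tractable family so as to minimize the [[Kullback–Leibler divergence]] from the true posterior, which is equivalent to maximizing a lower bound on the log evidence:

:<math>q^{*} = \underset{q}{\operatorname{arg\,min}} \; D_{\mathrm{KL}}\!\left(q(\mathbf{Z}) \parallel p(\mathbf{Z} \mid \mathbf{X})\right) = \underset{q}{\operatorname{arg\,max}} \; \mathbb{E}_{q}\!\left[\log p(\mathbf{X}, \mathbf{Z}) - \log q(\mathbf{Z})\right].</math>

The equivalence holds because <math>\log p(\mathbf{X})</math> does not depend on <math>q</math>, so the bound can be optimized without evaluating the intractable normalizing constant; when <math>q</math> is restricted to a factorized family, each factor's optimum can be found analytically, which is the locally optimal analytical solution described above.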
Variational Bayes can be seen as an extension of the [[expectation–maximization algorithm|expectation–maximization]] (EM) algorithm from [[maximum likelihood estimation|maximum likelihood]] (ML) or [[maximum a posteriori estimation|maximum a posteriori]] (MAP) estimation of the single most probable value of each parameter to fully Bayesian estimation, which computes (an approximation to) the entire posterior distribution of the parameters and latent variables.
For many applications, variational Bayes produces solutions comparable in accuracy to Gibbs sampling at greater speed. However, deriving the set of equations used to update the parameters iteratively often requires a large amount of work compared with deriving the corresponding Gibbs sampling equations. This is the case even for many models that are conceptually quite simple, as is demonstrated below in the case of a basic non-hierarchical model with only two parameters and no latent variables.
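As an illustration of what such iterative updates can look like, the following Python sketch implements coordinate-ascent updates for one simple conjugate model of this kind, a Gaussian with unknown mean and precision under a Normal-Gamma prior. The model, priors, and variable names are assumptions chosen for illustration, not a definitive implementation:

<syntaxhighlight lang="python">
import numpy as np

def cavi_normal_gamma(x, mu0=0.0, lambda0=1.0, a0=1.0, b0=1.0, iters=50):
    """Coordinate-ascent variational inference for the model
    x_i ~ N(mu, 1/tau), mu | tau ~ N(mu0, 1/(lambda0*tau)), tau ~ Gamma(a0, b0),
    approximating the posterior by q(mu) q(tau) = N(mu_n, 1/lambda_n) Gamma(a_n, b_n)."""
    n = len(x)
    x_bar = np.mean(x)
    # These two quantities are fixed by the model; only lambda_n and b_n iterate.
    mu_n = (lambda0 * mu0 + n * x_bar) / (lambda0 + n)
    a_n = a0 + (n + 1) / 2.0
    b_n = b0  # initial guess for the rate parameter of q(tau)
    for _ in range(iters):
        e_tau = a_n / b_n                 # E[tau] under the current q(tau)
        lambda_n = (lambda0 + n) * e_tau  # precision of q(mu)
        e_mu = mu_n                       # E[mu] under q(mu)
        e_mu2 = mu_n**2 + 1.0 / lambda_n  # E[mu^2] under q(mu)
        # b_n = b0 + (1/2) E_q(mu)[ sum_i (x_i - mu)^2 + lambda0*(mu - mu0)^2 ]
        b_n = b0 + 0.5 * (np.sum(x**2) - 2 * e_mu * np.sum(x) + n * e_mu2
                          + lambda0 * (e_mu2 - 2 * mu0 * e_mu + mu0**2))
    return mu_n, lambda_n, a_n, b_n

# Example usage: data drawn from N(2, 0.5^2); q(mu) and q(tau) should
# concentrate near the true mean 2 and true precision 1/0.25 = 4.
rng = np.random.default_rng(0)
x = rng.normal(2.0, 0.5, size=200)
print(cavi_normal_gamma(x))
</syntaxhighlight>

Even in this two-parameter case, each update formula must be derived by hand by taking expectations of the log joint density under the other factor, which is the derivation effort referred to above.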