#To derive a [[lower bound]] for the [[marginal likelihood]] (sometimes called the ''evidence'') of the observed data (i.e. the [[marginal probability]] of the data given the model, with marginalization performed over unobserved variables). This is typically used for performing [[model selection]], the general idea being that a higher marginal likelihood for a given model indicates a better fit of the data by that model and hence a greater probability that the model in question was the one that generated the data. (See also the [[Bayes factor]] article.)
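The lower bound referred to here is the standard evidence decomposition: for any distribution <math>q</math> over the unobserved variables <math>\mathbf{Z}</math>, the log marginal likelihood of the data <math>\mathbf{X}</math> splits into a tractable bound plus a non-negative KL term (a sketch of the identity, in generic notation):

```latex
\log p(\mathbf{X})
  = \underbrace{\mathbb{E}_{q}\!\left[\log p(\mathbf{X},\mathbf{Z}) - \log q(\mathbf{Z})\right]}_{\mathcal{L}(q)}
  \;+\; \underbrace{D_{\mathrm{KL}}\!\left(q(\mathbf{Z})\,\big\|\,p(\mathbf{Z}\mid\mathbf{X})\right)}_{\ge 0}
  \;\;\Longrightarrow\;\; \log p(\mathbf{X}) \ge \mathcal{L}(q)
```

Since the KL divergence is non-negative, <math>\mathcal{L}(q)</math> is a lower bound on the log evidence, and maximizing it over <math>q</math> simultaneously tightens the bound and drives <math>q</math> toward the true posterior.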
For the former purpose (approximating a posterior probability), variational Bayes is an alternative to [[Monte Carlo sampling]] methods such as [[Markov chain Monte Carlo]]: instead of drawing samples from the posterior, it produces a locally optimal, analytical approximation to it.
Variational Bayes can be seen as an extension of the [[expectation–maximization algorithm]] (EM) from [[maximum a posteriori estimation]] of the single most probable value of each parameter to fully Bayesian estimation, which computes (an approximation to) the entire posterior distribution of the parameters and latent variables.
For many applications, variational Bayes produces solutions of comparable accuracy to Gibbs sampling at greater speed. However, deriving the set of equations used to update the parameters iteratively often requires a large amount of work compared with deriving the comparable Gibbs sampling equations. This is the case even for many models that are conceptually quite simple, as is demonstrated below in the case of a basic non-hierarchical model with only two parameters and no latent variables.
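To make the iterative update equations concrete, the following is a minimal sketch of coordinate-ascent variational inference for a model of the kind just described: a univariate Gaussian with unknown mean and precision. The priors, the factorized form <math>q(\mu)\,q(\tau)</math>, and all variable names here are illustrative assumptions, not a prescribed implementation.

```python
import numpy as np

def cavi_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Coordinate-ascent variational inference for x_i ~ N(mu, 1/tau).

    Assumed priors (illustrative): mu | tau ~ N(mu0, 1/(lam0*tau)),
    tau ~ Gamma(a0, b0). Factorized approximation q(mu, tau) = q(mu) q(tau),
    where q(mu) = N(mu_n, 1/lam_n) and q(tau) = Gamma(a_n, b_n).
    """
    n, xbar = len(x), np.mean(x)
    e_tau = a0 / b0                   # initial guess for E[tau]
    a_n = a0 + (n + 1) / 2.0          # shape update does not change across iterations
    for _ in range(iters):
        # Update q(mu): its natural parameters depend on the current E[tau].
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        # Update q(tau): its rate depends on E[mu] and E[mu^2] under q(mu).
        e_mu, e_mu2 = mu_n, mu_n**2 + 1.0 / lam_n
        b_n = b0 + 0.5 * (np.sum(x**2) - 2.0 * e_mu * np.sum(x) + n * e_mu2
                          + lam0 * (e_mu2 - 2.0 * mu0 * e_mu + mu0**2))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n
```

Each update has a closed form only because the factorization and the conjugate priors were derived by hand first; that derivation is the up-front work the paragraph above contrasts with the (typically easier) derivation of Gibbs sampling conditionals for the same model.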