Variational Bayesian methods

=== KL divergence ===
 
The most common type of variational Bayes uses the [[Kullback–Leibler divergence]] (KL-divergence) of ''Q'' from ''P'' as the choice of dissimilarity function, a choice that makes the minimization tractable. The KL-divergence is defined as
 
:<math>D_{\mathrm{KL}}(Q \parallel P) \triangleq \sum_\mathbf{Z} Q(\mathbf{Z}) \log \frac{Q(\mathbf{Z})}{P(\mathbf{Z}\mid \mathbf{X})}.</math>
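For concreteness, the following is a minimal numerical sketch of this sum for distributions over a finite support of latent states. The function name <code>kl_divergence</code> and the toy distributions are illustrative assumptions, not part of the article; <code>p</code> stands in for the posterior <math>P(\mathbf{Z}\mid \mathbf{X})</math> evaluated pointwise.

<syntaxhighlight lang="python">
import numpy as np

def kl_divergence(q, p):
    """D_KL(Q || P) = sum_z Q(z) * log(Q(z) / P(z)) for discrete distributions.

    q and p are probability vectors over the same finite support, each
    summing to 1. Terms with Q(z) == 0 contribute 0, by the standard
    convention 0 * log 0 = 0.
    """
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = q > 0  # skip zero-probability terms of Q
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# Toy distributions over three latent states (illustrative values only).
Q = [0.5, 0.3, 0.2]
P = [0.4, 0.4, 0.2]
print(kl_divergence(Q, P))  # ~0.0253 nats
print(kl_divergence(P, Q))  # ~0.0258 nats: the divergence is asymmetric
</syntaxhighlight>

As the example shows, the two directions differ in general. Variational Bayes minimizes the ''Q''-from-''P'' direction defined above, in which the expectation is taken with respect to the tractable approximating distribution ''Q'' rather than the intractable posterior.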