Variational autoencoder

\mathbb {E}_{\epsilon}\left[ \nabla_\phi \ln {\frac {p_{\theta }(x, \mu_\phi(x) + L_\phi(x)\epsilon)}{q_{\phi }(\mu_\phi(x) + L_\phi(x)\epsilon | x)}}\right] </math>and so we obtain an unbiased estimator of the gradient, allowing [[stochastic gradient descent]].
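The unbiasedness of this reparameterized gradient estimator can be checked numerically in a toy one-dimensional setting. The sketch below is illustrative, not part of the article's derivation: it takes <math>q_\phi = \mathcal{N}(\mu, \sigma^2)</math>, a simple test function <math>f(z) = z^2</math> in place of the log-ratio, and Monte Carlo averages the reparameterized gradient with respect to <math>\mu</math>; the variable names are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (not the full VAE objective): q_phi = N(mu, sigma^2) in 1-D,
# and we estimate d/dmu E_z[f(z)] for f(z) = z^2.
mu, sigma = 1.5, 0.7

# Reparameterize: z = mu + sigma * eps with eps ~ N(0, 1), so the gradient
# moves inside the expectation: d/dmu E[f(z)] = E[f'(mu + sigma * eps)]
#                                             = E[2 * (mu + sigma * eps)].
eps = rng.standard_normal(200_000)
grad_estimate = np.mean(2.0 * (mu + sigma * eps))

# Analytically, E[z^2] = mu^2 + sigma^2, so d/dmu E[z^2] = 2 * mu = 3.0.
print(grad_estimate)  # close to 3.0
```

With enough samples the Monte Carlo average concentrates on the analytic gradient, which is what makes plain stochastic gradient descent applicable.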
 
Since we reparametrized <math>z</math>, we need to find <math>q_\phi(z|x)</math>. Let <math>q_0</math> be the probability density function of <math>\epsilon</math>. By the change-of-variables formula, <math display="block">\ln q_\phi(z | x) = \ln q_0 (\epsilon) - \ln|\det(\partial_\epsilon z)|</math>where <math>\partial_\epsilon z</math> is the Jacobian matrix of <math>z</math> with respect to <math>\epsilon</math>. Since <math>z = \mu_\phi(x) + L_\phi(x)\epsilon </math>, the Jacobian is simply <math>L_\phi(x)</math>, and since <math>\epsilon</math> is a standard normal vector in <math>n</math> dimensions, <math>\ln q_0(\epsilon) = -\frac 12 \|\epsilon\|^2 - \frac n2 \ln(2\pi)</math>. Therefore <math display="block">\ln q_\phi(z | x) = -\frac 12 \|\epsilon\|^2 - \ln|\det L_\phi(x)| - \frac n2 \ln(2\pi)</math>
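This change-of-variables identity can be verified numerically. In the minimal sketch below, the encoder outputs <math>\mu_\phi(x)</math> and <math>L_\phi(x)</math> are replaced by fixed illustrative values `mu` and `L` (assumptions of this example), and the closed-form expression is compared against the Gaussian log-density of <math>z \sim \mathcal{N}(\mu, LL^\top)</math> computed directly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2

# Illustrative stand-ins for the encoder outputs mu_phi(x) and L_phi(x):
mu = np.array([0.3, -1.2])
L = np.array([[0.8, 0.0],
              [0.5, 1.1]])  # lower-triangular, so Sigma = L @ L.T

eps = rng.standard_normal(n)  # eps ~ N(0, I)
z = mu + L @ eps              # reparameterized sample

# Closed form from the text:
# ln q_phi(z|x) = -||eps||^2 / 2 - ln|det L| - (n/2) ln(2 pi)
log_q = (-0.5 * eps @ eps
         - np.log(abs(np.linalg.det(L)))
         - 0.5 * n * np.log(2 * np.pi))

# Independent check: the log-density of N(mu, Sigma) evaluated at z,
# computed directly from the multivariate normal formula.
Sigma = L @ L.T
diff = z - mu
ref = (-0.5 * diff @ np.linalg.solve(Sigma, diff)
       - 0.5 * np.log(np.linalg.det(2 * np.pi * Sigma)))

print(np.isclose(log_q, ref))  # True
```

The two values agree because <math>(z-\mu)^\top \Sigma^{-1} (z-\mu) = \|\epsilon\|^2</math> and <math>\det \Sigma = (\det L)^2</math> when <math>\Sigma = LL^\top</math>.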
 
== Variations ==