Variational autoencoder: Difference between revisions

 
== Reparameterization ==
[[File:Reparameterization Trick.png|thumb|300x300px|The scheme of the reparameterization trick. The random variable <math>{\varepsilon}</math> is injected into the latent space <math>z</math> as an external input. In this way, the gradient can be backpropagated without involving the stochastic variable in the update.]]{{Main|Reparametrization trick}}
To efficiently search for <math display="block">\theta^*,\phi^* = \underset{\theta,\phi}{\operatorname{arg\,max}} \, L_{\theta,\phi}(x) </math>the typical method is [[gradient descent]].
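A minimal sketch of the reparameterization trick for a diagonal Gaussian encoder, written in Python with PyTorch (the choice of library, and the function and variable names, are illustrative assumptions rather than part of the article). The sample is rewritten as <math>z = \mu + \sigma \odot \varepsilon</math> with <math>\varepsilon \sim \mathcal{N}(0, I)</math>, so that gradient descent can propagate gradients through <math>\mu</math> and <math>\sigma</math> while the randomness stays in the external input <math>\varepsilon</math>.

<syntaxhighlight lang="python">
import torch

def reparameterize(mu, log_var):
    # z = mu + sigma * eps, with eps ~ N(0, I).
    # The randomness is confined to eps, so gradients flow through mu and sigma.
    std = torch.exp(0.5 * log_var)   # sigma = exp(log_var / 2)
    eps = torch.randn_like(std)      # external noise, not part of the computation graph parameters
    return mu + std * eps

# Illustration: gradients reach the encoder outputs despite the sampling step.
mu = torch.zeros(4, requires_grad=True)
log_var = torch.zeros(4, requires_grad=True)
z = reparameterize(mu, log_var)
z.sum().backward()
print(mu.grad, log_var.grad)  # both gradients are populated
</syntaxhighlight>

In practice <code>mu</code> and <code>log_var</code> would be produced by the encoder network, and the resulting <code>z</code> fed to the decoder, with the whole ELBO objective optimized by gradient descent as described above.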