Flow-based generative model
{{Machine learning bar}}
 
A '''flow-based generative model''' is a [[generative model]] used in [[machine learning]] that explicitly models a [[probability distribution]] by leveraging '''normalizing flow''',<ref>{{cite arXiv | eprint=1505.05770}}</ref> a statistical method that uses the [[Probability density function#Function of random variables and change of variables in the probability density function|change-of-variable]] law of probabilities to transform a simple distribution into a complex one, typically the distribution (i.e. the likelihood function) of the observed data, <math>p(\textbf{x})</math>.
 
Direct modeling of the likelihood provides many advantages. For example, the negative log-likelihood can be computed directly and minimized as the [[loss function]]. Additionally, novel samples can be generated by sampling from the initial distribution and applying the flow transformation.
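The sampling procedure above can be sketched as follows. This is an illustrative toy, not code from any particular model: the flow <code>f</code> here is a hypothetical single affine map, whereas real flow-based models compose many learned invertible layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z0, a=2.0, b=1.0):
    """A toy invertible flow: z1 = a*z0 + b (parameters a, b are assumptions)."""
    return a * z0 + b

z0 = rng.standard_normal(1000)  # sample from the simple base distribution p0
samples = f(z0)                 # apply the flow transformation to get novel samples

# The pushed-forward samples follow Normal(b, a^2), so their mean and
# standard deviation should be roughly 1.0 and 2.0 respectively.
print(samples.mean(), samples.std())
```

Because the flow is invertible, the same map also lets the model evaluate the likelihood of data exactly, which is what distinguishes flows from, e.g., GANs.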
 
== Method ==
 
Let <math>z_0</math> be a (possibly multivariate) [[random variable]] with distribution <math>p_0(z_0)</math>.
 
Let <math>z_1 = f_1(z_0)</math> be a random variable which is a function of <math>z_0</math>. The function <math>f_1</math> should be invertible, i.e. the [[inverse function]] <math>f^{-1}_1</math> exists.
 
By the [[Probability density function#Function of random variables and change of variables in the probability density function|change of variable]] formula, the distribution of <math>z_1</math> is:
 
: <math>p_1(z_1) = p_0(f^{-1}_1(z_1))\left|\det \frac{df_1^{-1}}{dz_1}\right|</math>
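The formula above can be checked numerically. The sketch below (an assumption for illustration, not from the article) uses a single affine flow <math>f_1(z_0) = a z_0 + b</math> with a standard normal base density <math>p_0</math>, and verifies that the transformed density <math>p_1</math> integrates to one.

```python
import numpy as np

a, b = 2.0, 1.0  # hypothetical flow parameters

def p0(z0):
    """Standard normal base density p0."""
    return np.exp(-0.5 * z0**2) / np.sqrt(2 * np.pi)

def f1_inverse(z1):
    """Inverse of the affine flow f1(z0) = a*z0 + b."""
    return (z1 - b) / a

def p1(z1):
    """Change of variables: p1(z1) = p0(f1^{-1}(z1)) * |d f1^{-1}/d z1|.
    For the affine flow, |d f1^{-1}/d z1| = 1/|a|."""
    return p0(f1_inverse(z1)) * (1.0 / abs(a))

# p1 is the density of Normal(b, a^2); integrating over a wide grid gives ~1.
grid = np.linspace(-10.0, 10.0, 10001)
total = np.trapz(p1(grid), grid)
print(total)
```

In the multivariate case the scalar derivative is replaced by the absolute value of the Jacobian determinant, which is why flow architectures are designed so that this determinant is cheap to compute.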
 
== Examples ==
== External links ==
* [https://lilianweng.github.io/lil-log/2018/10/13/flow-based-deep-generative-models.html Flow-based Deep Generative Models]
 
 
[[Category:Machine learning]]