A '''flow-based generative model''' is a [[generative model]] used in [[machine learning]] that explicitly models a [[probability distribution]] by leveraging '''normalizing flow''',<ref>{{cite arXiv | eprint=1505.05770| author1=Danilo Jimenez Rezende| last2=Mohamed| first2=Shakir| title=Variational Inference with Normalizing Flows| year=2015| class=stat.ML}}</ref> which is a statistical method using the [[Probability density function#Function of random variables and change of variables in the probability density function|change-of-variable]] law of probabilities to transform a simple distribution into a complex one.
The direct modeling of likelihood provides many advantages. For example, the negative log-likelihood can be directly computed and minimized as the [[loss function]]. Additionally, novel samples can be generated by sampling from the initial distribution and applying the flow transformation.
In contrast, many alternative generative modeling methods such as [[Autoencoder#Variational autoencoder (VAE)|variational autoencoder (VAE)]] and [[generative adversarial network]] do not explicitly represent the likelihood function.
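As a minimal sketch of the change-of-variables law described above, consider a single affine transform <math>x = az + b</math> applied to a standard normal base distribution; the names <code>a</code>, <code>b</code>, and the functions below are illustrative, not drawn from any particular library:

```python
import math

def log_prob_base(z):
    # log density of the standard normal base distribution p_z
    return -0.5 * (z * z + math.log(2.0 * math.pi))

def log_prob_flow(x, a, b):
    # change of variables:
    # log p_x(x) = log p_z(f^{-1}(x)) + log |d f^{-1}/dx|
    z = (x - b) / a                      # inverse transform f^{-1}
    return log_prob_base(z) - math.log(abs(a))

def log_prob_normal(x, mu, sigma):
    # direct evaluation of the N(mu, sigma^2) log density, for comparison
    return (-0.5 * (((x - mu) / sigma) ** 2 + math.log(2.0 * math.pi))
            - math.log(sigma))

# The flow's likelihood agrees with the known density of N(b, a^2):
print(abs(log_prob_flow(1.3, 2.0, 0.5) - log_prob_normal(1.3, 0.5, 2.0)) < 1e-12)
```

In practice a normalizing flow composes many such invertible transforms, and the log-determinant terms of each layer are summed; the exact log-likelihood obtained this way is what makes direct maximum-likelihood training possible.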