Learning probability distributions by differentiating log Jacobians originated in the Infomax (maximum likelihood) approach to ICA,<ref>{{cite journal |last1=Bell |first1=A. J. |last2=Sejnowski |first2=T. J. |year=1995 |title=An information-maximization approach to blind separation and blind deconvolution |journal=Neural Computation |volume=7 |issue=6 |pages=1129–1159 |doi=10.1162/neco.1995.7.6.1129}}</ref> which forms a single-layer (<math>K=1</math>) flow-based model. Relatedly, a single-layer precursor of conditional generative flows was proposed by Roth and Baram (1996).<ref>{{cite journal |last1=Roth |first1=Z. |last2=Baram |first2=Y. |year=1996 |title=Multidimensional density shaping by sigmoids |journal=IEEE Transactions on Neural Networks |volume=7 |issue=5 |pages=1291–1298 |doi=10.1109/72.536322}}</ref>
To efficiently compute the log likelihood, the functions <math>f_1, ..., f_K</math> should be easily invertible, and the determinants of their Jacobians should be simple to compute. In practice, the functions <math>f_1, ..., f_K</math> are modeled using [[Deep learning|deep neural networks]], and are trained to minimize the negative log-likelihood of data samples from the target distribution. These architectures are usually designed such that only the forward pass of the neural network is required in both the inverse and the Jacobian determinant calculations. Examples of such architectures include NICE,<ref name=":1">{{cite arXiv | eprint=1410.8516| last1=Dinh| first1=Laurent| last2=Krueger| first2=David| last3=Bengio| first3=Yoshua| title=NICE: Non-linear Independent Components Estimation| year=2014| class=cs.LG}}</ref> RealNVP,<ref name=":2">{{cite arXiv | eprint=1605.08803| last1=Dinh| first1=Laurent| last2=Sohl-Dickstein| first2=Jascha| last3=Bengio| first3=Samy| title=Density estimation using Real NVP| year=2016| class=cs.LG}}</ref> and Glow.<ref name="glow">{{cite arXiv | eprint=1807.03039| last1=Kingma| first1=Diederik P.| last2=Dhariwal| first2=Prafulla| title=Glow: Generative Flow with Invertible 1x1 Convolutions| year=2018| class=stat.ML}}</ref>
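The sketch below illustrates this design with a RealNVP-style affine coupling layer written in PyTorch (an assumption; the cited papers do not prescribe a framework, and all class and variable names here are illustrative rather than taken from any reference implementation). It shows why only a forward pass of the conditioner networks is needed for both the exact inverse and the Jacobian log-determinant, and how the negative log-likelihood is formed from a standard normal base distribution.

<syntaxhighlight lang="python">
# Minimal sketch of an affine coupling layer (RealNVP-style), not a reference implementation.
import torch
import torch.nn as nn
from torch.distributions import Normal

class AffineCoupling(nn.Module):
    """Split x into two halves; transform one half conditioned on the other.
    The inverse and log|det J| both require only forward passes of the conditioner nets."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        # Small networks producing the elementwise log-scale and shift.
        self.scale_net = nn.Sequential(nn.Linear(self.d, hidden), nn.ReLU(),
                                       nn.Linear(hidden, dim - self.d), nn.Tanh())
        self.shift_net = nn.Sequential(nn.Linear(self.d, hidden), nn.ReLU(),
                                       nn.Linear(hidden, dim - self.d))

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.scale_net(x1), self.shift_net(x1)
        z2 = x2 * torch.exp(s) + t              # elementwise affine transform
        log_det = s.sum(dim=1)                  # Jacobian is triangular: log|det| = sum of s
        return torch.cat([x1, z2], dim=1), log_det

    def inverse(self, z):
        z1, z2 = z[:, :self.d], z[:, self.d:]
        s, t = self.scale_net(z1), self.shift_net(z1)
        x2 = (z2 - t) * torch.exp(-s)           # exact inverse, no iterative solve needed
        return torch.cat([z1, x2], dim=1)

def negative_log_likelihood(layer, x):
    """-log p(x) = -[ log N(f(x); 0, I) + log|det J_f(x)| ], averaged over the batch."""
    z, log_det = layer(x)
    base = Normal(0.0, 1.0)
    return -(base.log_prob(z).sum(dim=1) + log_det).mean()
</syntaxhighlight>

In practice, several such layers <math>f_1, ..., f_K</math> are composed, with the partition of the dimensions alternating between layers so that every coordinate is eventually transformed; the per-layer log-determinants simply add up in the total log-likelihood.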