=== Continuous Normalizing Flow (CNF) ===
Instead of constructing a flow by function composition, another approach is to formulate the flow as a continuous-time dynamic.<ref name="ffjord">{{cite arXiv | eprint=1810.01367| last1=Grathwohl| first1=Will| last2=Chen| first2=Ricky T. Q.| last3=Bettencourt| first3=Jesse| last4=Sutskever| first4=Ilya| last5=Duvenaud| first5=David| title=FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models| year=2018| class=cs.LG}}</ref> Let <math>z_0</math> be the latent variable with distribution <math>p(z_0)</math>. This latent variable is mapped to data space with the following flow function:
: <math>x = F(z_0) = z_T = z_0 + \int_0^T f(z_t, t) \, dt</math>
: <math>\log p(x) = \log p(z_0) - \int_0^T \text{Tr}\left[\frac{\partial f}{\partial z_t}\right] dt</math>
Because the flow and its log-density change are defined by integrals, techniques such as Neural ODE<ref>{{cite arXiv | eprint=1806.07366| last1=Chen| first1=Ricky T. Q.| last2=Rubanova| first2=Yulia| last3=Bettencourt| first3=Jesse| last4=Duvenaud| first4=David| title=Neural Ordinary Differential Equations| year=2018| class=cs.LG}}</ref> may be needed in practice to solve them numerically.
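The two integrals above can be illustrated with a minimal numerical sketch. The linear vector field <math>f(z, t) = Az</math>, the Euler integrator, and the step count below are illustrative choices, not part of any published method; for a linear field the trace term <math>\text{Tr}[\partial f/\partial z_t]</math> is simply <math>\text{Tr}(A)</math>.

```python
import numpy as np

# Illustrative linear dynamics f(z, t) = A z (A is an arbitrary example).
A = np.array([[0.1, -0.3],
              [0.2,  0.05]])

def f(z, t):
    # Time-independent for simplicity; a neural network would go here.
    return A @ z

def flow(z0, T=1.0, steps=1000):
    """Euler-integrate dz/dt = f(z, t) from t = 0 to T while
    accumulating the log-density change  -∫ Tr(∂f/∂z) dt."""
    dt = T / steps
    z = z0.copy()
    delta_logp = 0.0
    for k in range(steps):
        t = k * dt
        z = z + f(z, t) * dt
        delta_logp -= np.trace(A) * dt  # Tr(∂f/∂z) = Tr(A) for linear f
    return z, delta_logp

z0 = np.array([1.0, 0.0])
x, delta_logp = flow(z0)
# log p(x) = log p(z0) + delta_logp
```

In practice the fixed-step Euler loop is replaced by an adaptive ODE solver, and the trace of the Jacobian of a neural network is estimated stochastically (e.g. with Hutchinson's estimator) rather than computed exactly.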
== Applications ==