Differentiable programming

* '''Static, [[compiled]] graph'''-based approaches such as [[TensorFlow]],<ref group=note>TensorFlow 1 uses the static graph approach, whereas TensorFlow 2 uses the dynamic graph approach by default.</ref> [[Theano (software)|Theano]], and [[MXNet]]. They tend to allow for good [[compiler optimization]] and easier scaling to large systems, but their static nature limits interactivity and the types of programs that can be created easily (e.g. those involving [[loop (computing)|loops]] or [[recursion]]), and makes it harder for users to reason effectively about their programs.<ref name="flux" /> A proof-of-concept compiler toolchain called Myia uses a subset of Python as a front end and supports higher-order functions, recursion, and higher-order derivatives.<ref>{{cite book |last1=Merriënboer |first1=Bart van |last2=Breuleux |first2=Olivier |last3=Bergeron |first3=Arnaud |last4=Lamblin |first4=Pascal |chapter=Automatic differentiation in ML: where we are and where we should be going |title={{harvnb|NIPS'18}} |date=3 December 2018 |volume=31 |pages=8771–81 |chapter-url = https://papers.nips.cc/paper/2018/hash/770f8e448d07586afbf77bb59f698587-Abstract.html}}</ref><ref name="myia1">{{Cite web |last1=Breuleux |first1=O. |last2=van Merriënboer |first2=B. |date=2017 |url=https://www.sysml.cc/doc/2018/39.pdf |title=Automatic Differentiation in Myia |access-date=2019-06-24}}</ref><ref name="pytorchtut">{{Cite web|url=https://pytorch.org/tutorials/beginner/examples_autograd/tf_two_layer_net.html |title=TensorFlow: Static Graphs |work=Tutorials: Learning PyTorch |publisher=PyTorch.org |access-date=2019-03-04}}</ref>
 
* '''[[Operator overloading]], dynamic graph'''-based approaches such as [[PyTorch]] and [[NumPy]]'s autograd package. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they lead to [[interpreter (computing)|interpreter]] overhead (particularly when composing many small operations), poorer scalability, and reduced benefit from compiler optimization.<ref name="myia1" /><ref name="pytorchtut" /> A package for the [[Julia (programming language)|Julia]] programming language{{snd}} [https://github.com/FluxML/Zygote.jl Zygote]{{snd}} works directly on Julia's [[intermediate representation]], allowing it to still be [[compiler optimization|optimized]] by Julia's just-in-time compiler.<ref name="flux" /><ref>{{cite arXiv|last=Innes|first=Michael|date=2018-10-18|title=Don't Unroll Adjoint: Differentiating SSA-Form Programs|eprint=1810.07951|class=cs.PL}}</ref><ref name="diffprog-zygote" />
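The static-graph approach above can be illustrated with a toy sketch (a hypothetical minimal API, not TensorFlow's or Theano's): the entire computation graph is declared up front with symbolic placeholders, and the gradient is itself constructed as a new graph before any values flow through it.

```python
# Toy static-graph sketch (hypothetical API): build first, run later.
class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value
    def __add__(self, other): return Node("add", (self, other))
    def __mul__(self, other): return Node("mul", (self, other))

def placeholder(): return Node("placeholder")
def constant(v):   return Node("const", value=v)

def evaluate(node, feed):
    """Run the fixed graph with concrete values fed to placeholders."""
    if node.op == "placeholder": return feed[node]
    if node.op == "const":       return node.value
    a, b = (evaluate(i, feed) for i in node.inputs)
    return a + b if node.op == "add" else a * b

def grad(node, wrt):
    """Build a NEW graph for the derivative, before any evaluation."""
    if node.op == "placeholder":
        return constant(1.0 if node is wrt else 0.0)
    if node.op == "const":
        return constant(0.0)
    a, b = node.inputs
    if node.op == "add":
        return grad(a, wrt) + grad(b, wrt)
    return grad(a, wrt) * b + a * grad(b, wrt)  # product rule

x  = placeholder()
y  = x * x + constant(3.0) * x   # graph for x^2 + 3x
dy = grad(y, x)                  # graph for 2x + 3
```

Because both `y` and `dy` are complete graphs before evaluation, a framework is free to optimize or compile them as a whole; the trade-off is that ordinary Python control flow cannot appear inside the graph.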
 
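The operator-overloading, dynamic-graph approach can likewise be sketched in a few lines (a toy tape-based reverse-mode differentiator, not the actual API of autograd or PyTorch): each arithmetic operation records its inputs and local derivatives as the program runs, so ordinary Python control flow such as loops is traced naturally.

```python
# Toy operator-overloading sketch: the graph is recorded dynamically
# as the host-language program executes.
class Var:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0
    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])
    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])
    def backward(self, seed=1.0):
        # Reverse sweep: accumulate chain-rule contributions.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(2.0)
y = x
for _ in range(2):      # ordinary Python loop, traced as it runs
    y = y * x           # y becomes x^3
loss = y + Var(1.0)     # x^3 + 1
loss.backward()         # d(loss)/dx = 3x^2 = 12 at x = 2
```

The flexibility comes at the cost the article describes: every small operation passes through the interpreter, and no whole-program graph exists for a compiler to optimize ahead of time.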
A limitation of earlier approaches is that they are only able to differentiate code written in a suitable manner for the framework, limiting their interoperability with other programs. Newer approaches resolve this issue by constructing the graph from the language's syntax or IR, allowing arbitrary code to be differentiated.<ref name="flux" /><ref name="myia1" />
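The syntax/IR-transformation idea can be illustrated with a toy example using Python's standard `ast` module (a simplified sketch of the general technique, not how Zygote or Myia are implemented): the derivative is obtained by rewriting the program's syntax tree, so the original code uses only plain numbers and needs no framework-specific types.

```python
# Toy source transformation: differentiate ordinary Python source by
# rewriting its abstract syntax tree.
import ast

def d(node, wrt):
    """Return an AST expression for the derivative of `node` w.r.t. `wrt`."""
    if isinstance(node, ast.Name):
        return ast.Constant(1.0 if node.id == wrt else 0.0)
    if isinstance(node, ast.Constant):
        return ast.Constant(0.0)
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
        return ast.BinOp(d(node.left, wrt), ast.Add(), d(node.right, wrt))
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
        return ast.BinOp(  # product rule: d(l)*r + l*d(r)
            ast.BinOp(d(node.left, wrt), ast.Mult(), node.right), ast.Add(),
            ast.BinOp(node.left, ast.Mult(), d(node.right, wrt)))
    raise NotImplementedError(ast.dump(node))

def derivative(src, wrt):
    """Compile the derivative of a plain Python expression."""
    tree = ast.Expression(d(ast.parse(src, mode="eval").body, wrt))
    ast.fix_missing_locations(tree)
    return compile(tree, "<deriv>", "eval")

dfdx = derivative("x * x + 3.0 * x", "x")   # plain source, no special types
eval(dfdx, {"x": 2.0})                      # 2x + 3 at x = 2, i.e. 7.0
```

Because the transformation operates on source (or, in systems like Zygote, on the compiler's intermediate representation), code written without any framework in mind can still be differentiated.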