Differentiable programming: Difference between revisions

* '''Static, [[compiled]] graph-based''' approaches such as [[TensorFlow]], [[Theano]], and [[MXNet]]. They tend to allow good compiler optimization and easier scaling to large systems, but their static nature limits interactivity and the kinds of programs that can be created easily (for example, those involving loops or recursion), and makes it harder for users to reason effectively about their programs.<ref name="flux" /><ref name="myia1">{{Cite web|url=https://github.com/mila-iqia/myia/blob/master/README.rst|title=Myia|access-date=2019-03-04}}</ref>
 
* '''Operator-overloading, dynamic graph-based''' approaches such as [[PyTorch]] and [[AutoGrad (NumPy)|AutoGrad]]. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they introduce interpreter overhead (particularly when composing many small operations), scale less well, and benefit little from compiler optimization.<ref name="myia1" />
 
Both of these earlier approaches can generally only differentiate code written in a manner suited to the framework, limiting their interoperability with other programs.
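The operator-overloading approach described above can be sketched in a few lines of plain Python. This is a simplified illustration, not the API of PyTorch or any other library: a `Var` class overloads `+` and `*` so that the computation graph is recorded as ordinary code runs, and a recursive `backward` pass then accumulates derivatives (production systems instead replay a topologically ordered tape).

```python
# Minimal sketch of operator-overloading reverse-mode differentiation.
# Each arithmetic operation records (parent, local derivative) pairs,
# building the graph dynamically as normal Python code executes.
class Var:
    def __init__(self, value, grad_fn=()):
        self.value = value
        self.grad = 0.0
        self.grad_fn = grad_fn  # pairs of (parent Var, local derivative)

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Propagate the incoming derivative to this node and its parents.
        self.grad += seed
        for parent, local in self.grad_fn:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x      # the graph is built while ordinary Python runs
z.backward()
print(x.grad)      # dz/dx = y + 1 = 5.0
print(y.grad)      # dz/dy = x = 3.0
```

Because the graph is rebuilt on every execution, arbitrary Python control flow (loops, recursion, branching on data) works transparently; the cost is that every small operation passes through the interpreter, which is the overhead the static, compiled-graph frameworks avoid.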