Automatic Functional Differentiation in JAX (2311.18727v2)

Published 30 Nov 2023 in cs.PL, cs.CL, and cs.LG

Abstract: We extend JAX with the capability to automatically differentiate higher-order functions (functionals and operators). By representing functions as a generalization of arrays, we seamlessly use JAX's existing primitive system to implement higher-order functions. We present a set of primitive operators that serve as foundational building blocks for constructing several key types of functionals. For every introduced primitive operator, we derive and implement both linearization and transposition rules, aligning with JAX's internal protocols for forward and reverse mode automatic differentiation. This enhancement allows for functional differentiation in the same syntax traditionally used for functions. The resulting functional gradients are themselves functions ready to be invoked in Python. We showcase this tool's efficacy and simplicity through applications where functional derivatives are indispensable. The source code of this work is released at https://github.com/sail-sg/autofd .
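The core idea, a functional gradient that is itself a function, can be approximated with plain JAX by discretizing a function on a grid. The sketch below is illustrative only and does not use the autofd API: for the functional F[f] = ∫ f(x)² dx, the calculus of variations gives the functional derivative δF/δf = 2 f(x), which `jax.grad` recovers (up to the grid spacing dx) from a Riemann-sum discretization.

```python
# Illustrative sketch (not the autofd API): approximating a functional
# derivative with ordinary JAX autodiff on a discretized function.
import jax
import jax.numpy as jnp

xs = jnp.linspace(0.0, 1.0, 101)   # grid on [0, 1]
dx = xs[1] - xs[0]

def functional(f_vals):
    # Riemann-sum approximation of F[f] = ∫ f(x)^2 dx
    return jnp.sum(f_vals ** 2) * dx

f_vals = jnp.sin(xs)               # sample function f(x) = sin(x)

# The gradient w.r.t. the sampled values equals (δF/δf) * dx on the grid,
# so dividing by dx recovers the functional derivative 2 f(x).
g = jax.grad(functional)(f_vals)
func_deriv = g / dx                # ≈ 2 * sin(x) at every grid point
```

In autofd the discretization step is unnecessary: functions are first-class differentiable objects, and the returned gradient is directly a callable rather than an array of samples.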
