Optimistix: modular optimisation in JAX and Equinox (2402.09983v1)
Published 15 Feb 2024 in math.OC and cs.MS
Abstract: We introduce Optimistix, a nonlinear optimisation library built in JAX and Equinox. Optimistix takes a novel, modular approach to its minimisers and least-squares solvers. This modularity relies on new practical abstractions for optimisation, which we call searches and descents, and which generalise the classical notions of line searches, trust regions, and learning rates. The library provides high-level APIs and solvers for minimisation, nonlinear least squares, root-finding, and fixed-point iteration. Optimistix is available at https://github.com/patrick-kidger/optimistix.