AutoNumerics-Zero: Automated Discovery of State-of-the-Art Mathematical Functions (2312.08472v1)
Abstract: Computers calculate transcendental functions by approximating them through the composition of a few limited-precision instructions. For example, an exponential can be calculated with a Taylor series. These approximation methods were developed over the centuries by mathematicians, who emphasized the attainability of arbitrary precision. Computers, however, operate on only a few limited-precision types, such as the popular float32. In this study, we show that when aiming for limited precision, existing approximation methods can be outperformed by programs automatically discovered from scratch by a simple evolutionary algorithm. In particular, over the real numbers, our method approximates the exponential function with orders of magnitude more precision for a given number of operations than previous approaches. More practically, over float32 numbers and constrained to less than 1 ULP of error, the same method attains a speedup over baselines by generating code that triggers better XLA/LLVM compilation paths. In both cases, evolution searched a vast space of possible programs, without knowledge of mathematics, and discovered previously unknown, optimized approximations of high precision. We also give evidence that these results extend beyond the exponential. The ubiquity of transcendental functions suggests that our method has the potential to reduce the cost of scientific computing applications.
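To make the baseline concrete, the sketch below (not from the paper; the degree, interval, and helper names are illustrative assumptions) shows the classical approach the abstract alludes to: a truncated Taylor series for exp evaluated with Horner's rule entirely in float32, with its error measured in ULPs against a higher-precision reference. The evolved programs in the paper replace this hand-derived polynomial with automatically discovered code.

```python
# Minimal sketch, assuming float32 arithmetic and a hand-picked degree/interval;
# this is the conventional Taylor-series baseline, not the paper's evolved program.
import math
import numpy as np

def exp_taylor(x: np.float32, degree: int = 6) -> np.float32:
    """Truncated Taylor polynomial of exp at 0, evaluated with Horner's rule in float32."""
    coeffs = [np.float32(1.0 / math.factorial(k)) for k in range(degree, -1, -1)]
    acc = np.float32(0.0)
    for c in coeffs:
        acc = acc * x + c  # each Horner step rounds to float32
    return acc

def ulp_error(approx: np.float32, reference: float) -> float:
    """Absolute error of `approx` in units in the last place of the float32 reference."""
    return abs(float(approx) - reference) / float(np.spacing(np.float32(reference)))

# Worst-case ULP error on a small sample of the interval [-0.5, 0.5].
xs = np.linspace(-0.5, 0.5, 1001, dtype=np.float32)
worst = max(ulp_error(exp_taylor(x), math.exp(float(x))) for x in xs)
print(f"worst-case error on [-0.5, 0.5]: {worst:.2f} ULP")
```

Raising the degree drives the error toward the less-than-1-ULP regime discussed in the abstract, at the cost of more operations; the paper's point is that evolutionary search can find programs with a better precision-versus-cost trade-off than such hand-derived series.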