
Physical Symbolic Optimization (2312.03612v1)

Published 6 Dec 2023 in cs.LG, astro-ph.IM, cs.SC, physics.comp-ph, and physics.data-an

Abstract: We present a framework for constraining the automatic sequential generation of equations to obey the rules of dimensional analysis by construction. Combining this approach with reinforcement learning, we built $\Phi$-SO, a Physical Symbolic Optimization method for recovering analytical functions from physical data leveraging units constraints. Our symbolic regression algorithm achieves state-of-the-art results in contexts in which variables and constants have known physical units, outperforming all other methods on SRBench's Feynman benchmark in the presence of noise (exceeding 0.1%) and showing resilience even in the presence of significant (10%) levels of noise.
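The core idea of constraining equation generation by dimensional analysis can be illustrated with a minimal sketch (not the authors' $\Phi$-SO implementation): track each operand's physical unit as a vector of exponents and permit only dimensionally valid operations at each generation step. The `Unit` class and `allowed_ops` helper below are hypothetical names introduced for illustration.

```python
# Minimal sketch (assumed, not the Φ-SO code) of filtering candidate
# binary operations by dimensional analysis. A unit is a vector of
# exponents over the base dimensions (length, mass, time).
from dataclasses import dataclass

@dataclass(frozen=True)
class Unit:
    L: int = 0  # length exponent
    M: int = 0  # mass exponent
    T: int = 0  # time exponent

    def __mul__(self, other):
        # Multiplying quantities adds their unit exponents.
        return Unit(self.L + other.L, self.M + other.M, self.T + other.T)

    def __truediv__(self, other):
        # Dividing quantities subtracts unit exponents.
        return Unit(self.L - other.L, self.M - other.M, self.T - other.T)

DIMENSIONLESS = Unit()

def allowed_ops(left: Unit, right: Unit) -> list:
    """Return the binary operations that are dimensionally valid."""
    ops = ["mul", "div"]          # products and ratios are always valid
    if left == right:
        ops += ["add", "sub"]     # addition requires identical units
    return ops

# Example: velocity (m/s) combined with time (s).
velocity = Unit(L=1, T=-1)
time = Unit(T=1)
print(allowed_ops(velocity, time))  # add/sub excluded: units differ
print(velocity * time == Unit(L=1))  # velocity * time yields a length
```

During sequential generation, a constraint like this prunes the token vocabulary before the policy network samples, so dimensionally inconsistent expressions are never produced in the first place.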

