
Automated Discovery of Integral with Deep Learning (2402.18040v1)

Published 28 Feb 2024 in cs.AI and cs.LG

Abstract: Recent advancements in deep learning, particularly the development of LLMs, have demonstrated AI's ability to tackle complex mathematical problems and solve programming challenges. However, the capability to solve well-defined problems based on extensive training data differs significantly from the nuanced process of making scientific discoveries. Trained on almost all available human knowledge, today's sophisticated LLMs essentially learn to predict sequences of tokens. They generate mathematical derivations and write code much as they write an essay, and do not pioneer scientific discoveries in the manner a human scientist would. In this study we delve into the potential of using deep learning to rediscover a fundamental mathematical concept: integrals. By defining the integral as the area under a curve, we illustrate how AI can deduce the integral of a given function, exemplified by inferring $\int_{0}^{x} t^2 \, dt = \frac{x^3}{3}$ and $\int_{0}^{x} a e^{bt} \, dt = \frac{a}{b} e^{bx} - \frac{a}{b}$. Our experiments show that deep learning models can approach the task of inferring integrals either through a sequence-to-sequence model, akin to language translation, or by uncovering the rudimentary principles of integration, such as $\int_{0}^{x} t^n \, dt = \frac{x^{n+1}}{n+1}$.
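The area-under-the-curve definition the abstract relies on can be made concrete numerically. The following is a minimal illustrative sketch (not code from the paper): a left Riemann sum over $[0, x]$ that converges to the closed forms the models are asked to infer, here checked against $\int_{0}^{x} t^2\,dt = \frac{x^3}{3}$.

```python
def riemann_integral(f, x, n=100_000):
    """Approximate the integral of f over [0, x] as the area under
    the curve: a left Riemann sum with n equal-width slices."""
    dt = x / n
    return sum(f(i * dt) for i in range(n)) * dt

x = 2.0
approx = riemann_integral(lambda t: t**2, x)
exact = x**3 / 3  # the closed form the paper's models are expected to deduce
print(abs(approx - exact) < 1e-3)  # prints True: the sum converges to x^3/3
```

Training pairs of this numeric "area" against symbolic targets is, in essence, the data the paper's sequence-to-sequence setup consumes.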

Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. 
In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. 
[2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. 
[2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
  2. Rozière, B., et al.: Code llama: Open foundation models for code. (2023) arXiv:2308.12950 Huang et al. [2023] Huang, Y., et al.: Competition-level problems are effective llm evaluators. (2023) arXiv:2312.02143 Azerbayev et al. [2023] Azerbayev, Z., et al.: Llemma: An open language model for mathematics. (2023) arXiv:2310.10631 Trinh et al. [2024] Trinh, T.H., et al.: Solving olympiad geometry without human demonstrations. Nature 625, 476–482 (2024) d’Ascoli et al. [2022] d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022) Waltz and Buchanan [2009] Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Huang, Y., et al.: Competition-level problems are effective llm evaluators. (2023) arXiv:2312.02143 Azerbayev et al. [2023] Azerbayev, Z., et al.: Llemma: An open language model for mathematics. (2023) arXiv:2310.10631 Trinh et al. [2024] Trinh, T.H., et al.: Solving olympiad geometry without human demonstrations. Nature 625, 476–482 (2024) d’Ascoli et al. 
[2022] d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022) Waltz and Buchanan [2009] Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. 
In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Azerbayev, Z., et al.: Llemma: An open language model for mathematics. (2023) arXiv:2310.10631 Trinh et al. [2024] Trinh, T.H., et al.: Solving olympiad geometry without human demonstrations. Nature 625, 476–482 (2024) d’Ascoli et al. [2022] d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022) Waltz and Buchanan [2009] Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. 
eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. 
[2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Trinh, T.H., et al.: Solving olympiad geometry without human demonstrations. Nature 625, 476–482 (2024) d’Ascoli et al. [2022] d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022) Waltz and Buchanan [2009] Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. 
In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. 
In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022) Waltz and Buchanan [2009] Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. 
Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. 
[2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. 
https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. 
In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. 
[2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. 
https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. 
[2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. 
[2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. 
[2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. 
IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. 
[2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. 
[2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Landajuela, M., et al.: A unified framework for deep symbolic regression. 
  4. Azerbayev, Z., et al.: Llemma: An open language model for mathematics. (2023) arXiv:2310.10631
  5. Trinh, T.H., et al.: Solving olympiad geometry without human demonstrations. Nature 625, 476–482 (2024)
[1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022) Waltz and Buchanan [2009] Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. 
[2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. 
In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009) King et al. [2009] King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. 
[1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 King, R.D., et al.: The robot scientist adam. Computer 42(8), 46–54 (2009) Naik et al. [2016] Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. 
[2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. 
IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. 
In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. 
In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. 
[2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. 
[2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
  6. d’Ascoli, S., Kamienny, P.-A., Lample, G., Charton, F.: Deep symbolic regression for recurrent sequences. In: Proceedings of the 39th International Conference on Machine Learning, Baltimore, Maryland, USA (2022)
  7. Waltz, D., Buchanan, B.G.: Automating science. Science 324(5923), 43–44 (2009)
  8. King, R.D., et al.: The robot scientist Adam. Computer 42(8), 46–54 (2009)
[2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. 
[2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. 
[2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. 
[2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
  9. Naik, A.W., et al.: Active machine learning-driven experimentation to determine compound effects on protein patterns. eLife 5(e10047) (2016) Lample and Charton [2019] Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. 
[2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412 Vaswani et al. [2017] Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. 
In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017) Hendrycks et al. [2021] Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. 
Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. 
[2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
  10. Lample, G., Charton, F.: Deep learning for symbolic mathematics (2019) arXiv:1912.01412
  11. Vaswani, A., et al.: Attention is all you need. In: Proceedings of the 31st Conference on Advances in Neural Information Processing Systems, Long Beach, California, USA (2017)
In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
  12. Hendrycks, D., et al.: Measuring mathematical problem solving with the math dataset (2021) arXiv:2103.03874 Meurer et al. [2017] Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. 
https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. 
https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. 
[2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
  13. Meurer, A., et al.: Sympy: symbolic computing in python. PeerJ Computer Science 3, 103 (2017) Augusto and Barbosa [2000] Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000) Schmidt and Lipson [2009] Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009) Murari et al. [2014] Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014) McKay et al. [1995] McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. [2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995) Sahoo et al. 
[2018] Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018) Kim et al. [2020] Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
[2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020) Petersen et al. [2019] Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871 Landajuela et al. [2022] Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. [2022] Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416 Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022) Black et al. [2021] Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow. https://doi.org/10.5281/zenodo.5297715 Chung et al. 
  14. Augusto, D.A., Barbosa, H.J.: Symbolic regression via genetic programming. In: Proceedings Vol. 1. Sixth Brazilian Symposium on Neural Networks (2000)
  15. Schmidt, M., Lipson, H.: Distilling free-form natural laws from experimental data. Science 324(5923), 81–85 (2009)
  16. Murari, A., Peluso, E., Gelfusa, M., Lupelli, I., Lungaroni, M., Gaudio, P.: Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form. Plasma Physics and Controlled Fusion 57(1) (2014)
  17. McKay, B., Willis, M.J., Barton, G.W.: Using a tree structured genetic algorithm to perform symbolic regression. In: Proceedings of First International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (1995)
  18. Sahoo, S., Lampert, C., Martius, G.: Learning equations for extrapolation and control. In: Proceedings of International Conference on Machine Learning (2018)
  19. Kim, S., et al.: Integration of neural network-based symbolic regression in deep learning for scientific discovery. IEEE Transactions on Neural Networks and Learning Systems 32(9) (2020)
  20. Petersen, B.K., et al.: Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients (2019) arXiv:1912.04871
  21. Landajuela, M., et al.: A unified framework for deep symbolic regression. In: Advances in Neural Information Processing Systems (2022)
  22. Black, S., Gao, L., Wang, P., Leahy, C., Biderman, S.: GPT-Neo: Large Scale Autoregressive Language Modeling with Mesh-Tensorflow (2021). https://doi.org/10.5281/zenodo.5297715
  23. Chung, H.W., et al.: Scaling instruction-finetuned language models (2022) arXiv:2210.11416
Authors (1)
  1. Xiaoxin Yin (4 papers)
