Nuclear mass predictions using machine learning models (2401.02824v2)
Abstract: The exploration of nuclear mass, or binding energy, a fundamental property of atomic nuclei, remains at the forefront of nuclear physics research due to limitations in experimental studies and uncertainties in model calculations, particularly when moving away from the stability line. In this work, we employ two machine learning (ML) models, Support Vector Regression (SVR) and Gaussian Process Regression (GPR), to assess their performance in predicting nuclear mass excesses using available experimental data and a physics-based feature space. We also examine the extrapolation capabilities of these models using newly measured nuclei from AME2020 and by extending our calculations beyond the training and test set regions. Our results indicate that both the SVR and GPR models perform quite well within the training and test regions when informed with a physics-based feature space. Furthermore, these ML models demonstrate the ability to make reasonable predictions away from the available experimental data, offering results comparable to those of model calculations. Through further refinement, these models can serve as reliable and efficient ML tools for studying nuclear properties in the future.
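To make the described workflow concrete, here is a minimal sketch of how SVR and GPR regressors could be trained on a physics-motivated feature space to predict mass excesses, using scikit-learn. This is not the authors' code: the feature construction (mass number, asymmetry, pairing, distance to magic numbers), the hyperparameters, and the synthetic placeholder data are all assumptions for illustration; in practice the targets would come from an evaluated mass table such as AME2020.

```python
# Illustrative sketch (assumption, not the paper's actual setup): SVR and GPR
# trained on a physics-based feature space to predict nuclear mass excess.
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

def physics_features(Z, N):
    """Build a simple physics-based feature vector for each nucleus.
    The chosen terms (A, asymmetry, surface term, shell proximity, pairing)
    are illustrative assumptions, not necessarily the paper's feature set."""
    A = Z + N
    magic = np.array([2, 8, 20, 28, 50, 82, 126])
    dZ = np.min(np.abs(Z[:, None] - magic), axis=1)        # distance to proton shell closure
    dN = np.min(np.abs(N[:, None] - magic), axis=1)        # distance to neutron shell closure
    pairing = ((-1) ** (Z % 2) + (-1) ** (N % 2)) / 2.0     # +1 even-even, -1 odd-odd, 0 odd-A
    return np.column_stack([Z, N, A, (N - Z) / A, A ** (2 / 3), dZ, dN, pairing])

# Placeholder data: real training would use measured (Z, N) and mass excesses (MeV).
rng = np.random.default_rng(0)
Z = rng.integers(8, 100, size=500)
N = rng.integers(8, 150, size=500)
X = physics_features(Z.astype(float), N.astype(float))
y = rng.normal(size=500)  # stand-in for experimental mass excess values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=0.1))
gpr = make_pipeline(StandardScaler(),
                    GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                             normalize_y=True))

for name, model in [("SVR", svr), ("GPR", gpr)]:
    model.fit(X_tr, y_tr)
    rms = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
    print(f"{name} test RMS deviation: {rms:.3f}")
```

Extrapolation beyond the training region would amount to evaluating `model.predict` on feature vectors for nuclei absent from the data, with the GPR additionally able to return predictive uncertainties via `return_std=True`.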