
Statistical learnability of nuclear masses (1904.00057v3)

Published 29 Mar 2019 in nucl-th, cond-mat.dis-nn, and cs.LG

Abstract: More than 80 years after the seminal work of Weizsäcker and the liquid drop model of the atomic nucleus, the deviations of mass models from experiment ($\sim$ MeV) remain orders of magnitude larger than experimental errors ($\lesssim$ keV). Predicting the masses of atomic nuclei with precision is extremely challenging, owing to the non-trivial many-body interplay of protons and neutrons in nuclei and the complex nature of the nuclear strong force. The statistical theory of learning is used to provide bounds on the prediction errors of models trained with a finite data set. These bounds are validated with neural-network calculations and compared with state-of-the-art mass models. It is then argued that nuclear structure models of ground-state properties explore a system at the limit of the knowable, as defined by the statistical theory of learning.
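The MeV-scale model deviation the abstract refers to can be seen directly in the liquid-drop (semi-empirical) mass formula itself. The sketch below evaluates the Weizsäcker binding energy for ⁵⁶Fe using one commonly quoted parameter set (the coefficient values are an assumption for illustration; fitted values vary across the literature, and this is not the neural-network model of the paper):

```python
import math

# Illustrative liquid-drop coefficients in MeV (one common fit; values vary).
A_V, A_S, A_C, A_A, A_P = 15.75, 17.8, 0.711, 23.7, 11.18

def binding_energy(Z: int, A: int) -> float:
    """Semi-empirical (Weizsäcker) binding energy in MeV."""
    N = A - Z
    pairing = 0.0
    if Z % 2 == 0 and N % 2 == 0:      # even-even: bound more tightly
        pairing = +A_P / math.sqrt(A)
    elif Z % 2 == 1 and N % 2 == 1:    # odd-odd: bound less tightly
        pairing = -A_P / math.sqrt(A)
    return (A_V * A                     # volume term
            - A_S * A ** (2 / 3)        # surface term
            - A_C * Z * (Z - 1) / A ** (1 / 3)  # Coulomb repulsion
            - A_A * (A - 2 * Z) ** 2 / A        # symmetry term
            + pairing)                  # pairing term

b_fe56 = binding_energy(26, 56)  # roughly 495 MeV with these coefficients
print(f"B(56Fe) = {b_fe56:.1f} MeV, B/A = {b_fe56 / 56:.2f} MeV")
```

The experimental binding energy of ⁵⁶Fe is about 492.3 MeV, known to sub-keV precision, so even for this well-measured nucleus the liquid-drop estimate is off by a few MeV — the gap between model accuracy ($\sim$ MeV) and experimental accuracy ($\lesssim$ keV) that the paper takes as its starting point.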

