Evaluating the faithfulness of PDF uncertainties in the presence of inconsistent data (2503.17447v1)
Abstract: We critically assess the robustness of uncertainties on parton distribution functions (PDFs) determined with neural networks from global sets of experimental data collected from multiple experiments. We view the determination of PDFs as an inverse problem and study how the neural network model tackles it when inconsistencies between input datasets are present. We adopt a closure test approach, in which the regression model is applied to artificial data produced from a known underlying truth, so that the output of the model can be compared to the truth and its accuracy assessed in a statistically reliable way. We explore several phenomenologically relevant scenarios in which inconsistencies arise from the incorrect estimation of correlated systematic uncertainties. We show that the neural network generally corrects for the inconsistency, except in cases of extreme uncertainty underestimation; for the cases in which the inconsistency is not corrected, we propose and validate a procedure to detect it.
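The closure-test logic described in the abstract can be illustrated with a minimal numerical sketch. The snippet below is not the paper's actual NNPDF pipeline: it stands in a simple generalized-least-squares fit for the neural network, and all names (`closure_test`, `C_assumed`, `lam`, and so on) are hypothetical. It generates pseudodata from a known truth whose covariance includes a correlated systematic, fits with a covariance in which that systematic is underestimated by a factor `lam`, and measures how often the quoted 1-sigma bands cover the truth; faithful uncertainties would give roughly 68% coverage, while underestimation drives coverage below that.

```python
# Minimal closure-test sketch (illustrative only; a stand-in for the paper's
# methodology, not its implementation). All names here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# --- Known underlying "truth": a smooth law standing in for a PDF ---
x = np.linspace(0.05, 0.95, 20)
A = np.vander(x, 3)                  # design matrix for a quadratic model
theta_true = np.array([1.0, -2.0, 1.5])
f_true = A @ theta_true

# --- True experimental covariance: uncorrelated noise + one correlated systematic ---
sigma_stat = 0.05 * np.ones_like(x)
beta = 0.10 * x                      # correlated systematic shift (e.g. a normalization)
C_true = np.diag(sigma_stat**2) + np.outer(beta, beta)

# Inconsistency: the fit assumes a smaller systematic than the one that
# actually generated the data (lam = 1 would be the consistent case).
lam = 0.2
C_assumed = np.diag(sigma_stat**2) + np.outer(lam * beta, lam * beta)

def closure_test(n_replicas=1000):
    """Average per-point 1-sigma coverage of the truth over pseudodata replicas."""
    Ci = np.linalg.inv(C_assumed)
    H = np.linalg.inv(A.T @ Ci @ A) @ A.T @ Ci   # generalized-least-squares estimator
    covered = 0.0
    for _ in range(n_replicas):
        data = rng.multivariate_normal(f_true, C_true)   # pseudodata from the truth
        theta_hat = H @ data
        cov_theta = H @ C_assumed @ H.T                  # quoted (assumed) covariance
        pred = A @ theta_hat
        err = np.sqrt(np.einsum('ij,jk,ik->i', A, cov_theta, A))
        covered += np.mean(np.abs(pred - f_true) <= err)
    return covered / n_replicas

print(f"1-sigma coverage (expect ~0.68 if uncertainties are faithful): {closure_test():.3f}")
```

Because the pseudodata fluctuate with the true covariance while the quoted error bands are built from the underestimated one, the reported coverage falls short of 68%, which is exactly the kind of faithfulness failure the closure test is designed to expose.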