- The paper presents an unbiased determination of parton distributions at NNLO and LO, leveraging neural networks and Monte Carlo techniques.
- It benchmarks the Fixed Order plus Next-to-Leading Logarithms (FONLL) method for heavy-quark mass effects and rigorously validates predictions against LHC experimental data.
- The study highlights the convergence across perturbative orders and the sensitivity of PDFs to variations in αs and quark masses, underpinning high-precision collider phenomenology.
Unbiased Determination of Parton Distributions at NNLO and LO
This essay discusses the comprehensive study carried out by the NNPDF Collaboration on the unbiased global determination of parton distribution functions (PDFs) at next-to-next-to-leading order (NNLO) and leading order (LO) in quantum chromodynamics (QCD). The research extends the established NNPDF2.1 NLO set both to LO and to the more precise NNLO, with a key focus on minimizing parametrization bias and improving on earlier statistical methodologies.
The paper emphasizes the incorporation of heavy-quark masses using the Fixed Order plus Next-to-Leading Logarithms (FONLL) method. It benchmarks FONLL at NNLO and assesses the stability of the PDFs upon inclusion of NNLO corrections. The work investigates the convergence of the perturbative expansion across the LO, next-to-leading order (NLO), and NNLO levels. Importantly, it tests predictions against experimental data from the Large Hadron Collider (LHC), showcasing the implications of NNLO corrections for collider phenomenology.
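To make the matching idea concrete, the following is a minimal sketch of a FONLL-style combination of a heavy-quark structure function, assuming the standard decomposition into a massive fixed-order piece, its massless limit, and a resummed massless piece with an optional threshold damping factor. The function name and inputs are illustrative and not taken from the NNPDF code.

```python
def fonll_structure_function(F_massless, F_massive, F_massive_0,
                             Q2, m2, damping=True):
    """
    Schematic FONLL-style combination of a DIS structure function.

    F_massless  : massless-scheme result (resums logs of Q^2/m^2)
    F_massive   : fixed-order massive-scheme result
    F_massive_0 : massless (m -> 0) limit of the massive result
    Q2, m2      : photon virtuality and heavy-quark mass squared (GeV^2)
    """
    # "Difference" term: the resummed logs minus their fixed-order part,
    # removing the double counting between the two calculations.
    difference = F_massless - F_massive_0

    if Q2 <= m2:
        # Below threshold only the massive calculation is meaningful.
        difference = 0.0
    elif damping:
        # Optional threshold damping factor suppressing the difference
        # term in the region Q^2 ~ m^2, where mass effects dominate.
        difference *= (1.0 - m2 / Q2) ** 2

    return F_massive + difference
```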
A significant component of the paper is the examination of PDFs determined with varying values of the strong coupling constant αs and of the charm (mc) and bottom (mb) quark masses. It also explores PDF determinations based on varying subsets of the experimental data, highlighting the balance between dataset size and phenomenological accuracy.
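One common use of fits performed at several αs values is to combine PDF and αs uncertainties by resampling Monte Carlo replicas across the fits with Gaussian weights in αs. The sketch below illustrates that prescription under stated assumptions; the central value, uncertainty, and function names are illustrative choices, not the collaboration's actual tooling.

```python
import numpy as np

def combine_pdf_alphas(replica_sets, alphas_values,
                       alphas_central=0.119, alphas_sigma=0.0012,
                       n_draws=1000, rng=None):
    """
    Schematic combination of PDF and alpha_s uncertainties by resampling
    Monte Carlo replicas of some observable from fits performed at
    different alpha_s(M_Z) values.

    replica_sets  : list of 1-D arrays, one per alpha_s value, holding the
                    observable computed with every replica of that fit
    alphas_values : alpha_s(M_Z) used in each fit (same order as above)
    """
    rng = rng or np.random.default_rng(0)

    # Gaussian weight for each fit, centred on an assumed central value
    # and uncertainty for alpha_s(M_Z) (numbers are purely illustrative).
    weights = np.exp(-0.5 * ((np.asarray(alphas_values) - alphas_central)
                             / alphas_sigma) ** 2)
    weights /= weights.sum()

    draws = np.empty(n_draws)
    for k in range(n_draws):
        iset = rng.choice(len(replica_sets), p=weights)  # pick an alpha_s fit
        irep = rng.integers(len(replica_sets[iset]))     # pick one replica
        draws[k] = replica_sets[iset][irep]

    # Mean and spread of the pooled draws give the combined uncertainty.
    return draws.mean(), draws.std(ddof=1)
```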
Critical Methodological Aspects
The NNPDF methodology distinguishes itself through its reliance on neural networks to mitigate bias in the PDF parametrization. Experimental uncertainties are propagated with Monte Carlo techniques, which aim to achieve statistical robustness. The research validates this approach by checking compliance with the momentum sum rule at successive perturbative orders and by demonstrating the convergence of the perturbative expansion of the PDFs.
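As a rough illustration of the Monte Carlo side of the methodology, the sketch below generates pseudo-data replicas from an experimental covariance matrix and numerically checks a momentum sum rule. It is a simplified toy, assuming a plain multivariate-Gaussian fluctuation of the data and a user-supplied function returning the total momentum density; it omits details such as normalization uncertainties, and the names are hypothetical.

```python
import numpy as np
from scipy.integrate import quad

def generate_pseudodata_replicas(data, cov, n_rep=100, rng=None):
    """
    Toy version of Monte Carlo error propagation: fluctuate the experimental
    data within its covariance matrix to obtain pseudo-data replicas.  A
    separate neural-network PDF fit would then be run on each replica, and
    any derived quantity takes its central value and uncertainty from the
    mean and standard deviation over the resulting ensemble.
    """
    rng = rng or np.random.default_rng(42)
    return rng.multivariate_normal(np.asarray(data), np.asarray(cov), size=n_rep)

def momentum_sum(x_times_total_pdf, xmin=1e-5, xmax=1.0):
    """
    Numerical check of the momentum sum rule: the integral over x of x times
    the sum of all parton distributions should equal 1 at any scale and at
    every perturbative order.

    x_times_total_pdf : callable returning sum_i x*f_i(x, Q) at fixed Q
                        (a hypothetical interface, e.g. wrapping a PDF set).
    """
    value, _ = quad(x_times_total_pdf, xmin, xmax, limit=200)
    return value  # should be close to 1 within the numerical accuracy
```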
FastKernel technology plays a pivotal role in efficiently implementing the NNLO computation of deep inelastic scattering (DIS) and Drell-Yan (DY) observables without resorting to conventional K-factors. Furthermore, implementation challenges at NNLO, particularly those related to the Mellin transforms of the coefficient functions and to heavy-quark contributions, are addressed meticulously.
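The essence of the FastKernel approach is that DGLAP evolution and hard coefficient functions are precomputed into interpolation tables, so each theory prediction reduces to a linear (DIS) or bilinear (DY) contraction with the PDFs at the fitting scale. The NumPy sketch below shows the idea only; the array shapes and function names are assumptions, not the actual FastKernel interface.

```python
import numpy as np

def dis_prediction(fk_table, pdf_grid):
    """
    Schematic FastKernel-style prediction for a DIS observable: evolution and
    hard coefficient functions are folded into the table, so the prediction
    is a linear contraction with the PDFs at the fitting scale Q0.

    fk_table : array of shape (n_data, n_flavour, n_x)
    pdf_grid : array of shape (n_flavour, n_x), holding x*f_i(x_a, Q0)
    """
    return np.einsum('dia,ia->d', fk_table, pdf_grid)

def dy_prediction(fk_table, pdf_grid):
    """
    Hadronic (Drell-Yan) observables are bilinear in the PDFs, so the
    precomputed kernel carries two flavour indices and two x indices.

    fk_table : array of shape (n_data, n_flavour, n_flavour, n_x, n_x)
    """
    return np.einsum('dijab,ia,jb->d', fk_table, pdf_grid, pdf_grid)
```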
Implications and Future Directions
The paper provides an empirical basis for understanding the compatibility of NNLO phenomenology with precise LHC measurements, such as Higgs production, which receives substantial NNLO QCD corrections. It also delineates the critical role of NNLO PDFs in the high-precision predictions increasingly demanded by current collider experiments.
The investigation of different αs values highlights the sensitivity of PDFs to theoretical inputs, indicating areas for refinement in parameter determinations. The authors also discuss avenues for including theoretical uncertainties in PDF determinations, which become especially pertinent at NNLO and beyond.
In conclusion, this research enriches the theoretical landscape of parton distribution studies through an evidence-based, bias-minimized approach. It lays the groundwork for further theoretical refinements and for precise confrontations with experiment at future colliders. As the field progresses, integrating uncertainties due to theoretical assumptions and higher-order corrections will be pivotal in delivering the comprehensive and reliable predictions needed to uncover new physics phenomena.