
Parton distributions for the LHC Run II (1410.8849v4)

Published 31 Oct 2014 in hep-ph and hep-ex

Abstract: We present NNPDF3.0, the first set of parton distribution functions (PDFs) determined with a methodology validated by a closure test. NNPDF3.0 uses a global dataset including HERA-II deep-inelastic inclusive cross-sections, the combined HERA charm data, jet production from ATLAS and CMS, vector boson rapidity and transverse momentum distributions from ATLAS, CMS and LHCb, W+c data from CMS and top quark pair production total cross sections from ATLAS and CMS. Results are based on LO, NLO and NNLO QCD theory and also include electroweak corrections. To validate our methodology, we show that PDFs determined from pseudo-data generated from a known underlying law correctly reproduce the statistical distributions expected on the basis of the assumed experimental uncertainties. This closure test ensures that our methodological uncertainties are negligible in comparison to the generic theoretical and experimental uncertainties of PDF determination. This enables us to determine with confidence PDFs at different perturbative orders and using a variety of experimental datasets ranging from HERA-only up to a global set including the latest LHC results, all using precisely the same validated methodology. We explore some of the phenomenological implications of our results for the upcoming 13 TeV Run of the LHC, in particular for Higgs production cross-sections.

Citations (2,743)

Summary

  • The paper introduces NNPDF3.0, whose methodology is validated by closure tests showing that methodological uncertainties in the PDF extraction are negligible.
  • The analysis integrates an extensive global dataset, including HERA deep-inelastic scattering data and a broad range of LHC measurements, at LO, NLO, and NNLO in QCD with electroweak corrections.
  • Methodological enhancements in preprocessing, positivity constraints, and genetic algorithm minimization yield PDFs with reduced uncertainties for key LHC observables.

Essay on NNPDF3.0: Parton Distributions for LHC Run II

The paper presents NNPDF3.0, an advanced set of parton distribution functions (PDFs) aimed at improving the analysis of data from the second run of the Large Hadron Collider (LHC). This project, undertaken by the NNPDF Collaboration, leverages a novel methodology validated through a rigorous closure test, providing a high level of confidence in the PDF extraction from global datasets and addressing previous methodological uncertainties.

NNPDF3.0 is constructed using a comprehensive dataset, including HERA-II deep-inelastic scattering (DIS) cross-sections, the combined HERA charm production data, and a variety of LHC measurements such as jet production, vector boson rapidity and transverse momentum distributions, W+charm production, and top quark pair production. This extensive dataset is analyzed at leading order (LO), next-to-leading order (NLO), and next-to-next-to-leading order (NNLO) of QCD perturbation theory, with electroweak corrections included, enhancing the precision and reliability of the PDFs for LHC applications.

A core innovation of the NNPDF3.0 methodology is the improved treatment of uncertainties. The methodology undergoes a meticulous closure test, demonstrating that PDFs fitted to pseudo-data generated from a known underlying law correctly reproduce the statistical distributions expected from the assumed experimental uncertainties. This ensures that methodological uncertainties are negligible compared to theoretical and experimental uncertainties, thereby augmenting the robustness of the PDF results.
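To illustrate the closure-test logic (this is a toy sketch, not the NNPDF code), the following Python snippet generates pseudo-data from a known "truth", fits an ensemble of Monte Carlo replicas, and checks that the pulls of the central fit against the truth are of the size implied by the assumed uncertainties. The toy function and the weighted polynomial fit merely stand in for the physical PDFs and the neural-network parametrization.

```python
import numpy as np

rng = np.random.default_rng(42)

# 1. A known "truth": a toy function playing the role of the underlying law.
def truth(x):
    return x**0.5 * (1 - x)**3

# 2. Pseudo-data generated from the truth with assumed (uncorrelated) errors.
x_data = np.linspace(0.01, 0.9, 50)
sigma = 0.05 * truth(x_data) + 1e-3
pseudo = truth(x_data) + sigma * rng.normal(size=x_data.size)

# 3. Fit an ensemble of Monte Carlo replicas of the pseudo-data; a weighted
#    polynomial stands in for the neural-network parametrization.
def fit_replica(data):
    coeffs = np.polyfit(x_data, data, deg=4, w=1.0 / sigma)
    return np.polyval(coeffs, x_data)

replicas = np.array([
    fit_replica(pseudo + sigma * rng.normal(size=x_data.size))
    for _ in range(100)
])

# 4. Closure check: pulls of the central fit against the known truth should be
#    of order one if the fitted spread faithfully reflects the assumed errors.
central, spread = replicas.mean(axis=0), replicas.std(axis=0)
pulls = (central - truth(x_data)) / spread
print(f"mean pull = {pulls.mean():.2f}, std of pulls = {pulls.std():.2f}")
```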

The paper outlines various significant improvements in the NNPDF approach. These include the systematic determination of preprocessing exponents, refined positivity constraints ensuring physically meaningful PDFs, and optimized minimization strategies using a genetic algorithm, which are crucial for efficiently handling the complexity of the PDF fitting problem. This suite of methodological advances is shown to yield PDF sets with reduced uncertainties compared to previous global fits, particularly in the regions relevant to high-energy LHC processes such as Higgs production in gluon fusion.
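The toy sketch below conveys the flavour of a genetic-algorithm minimization of a chi-squared, with a simple x^a (1-x)^b form standing in for the preprocessed neural-network parametrization. The population size, mutation scale, and model are illustrative assumptions only, not the collaboration's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "data" and uncertainties standing in for the global dataset.
x_data = np.linspace(0.05, 0.9, 30)
target = x_data**0.4 * (1 - x_data)**3
sigma = 0.05 * target + 1e-3

def model(params, x):
    a, b, norm = params
    # x^a (1-x)^b shape: the exponents control the small-x and large-x endpoints,
    # loosely mimicking the role of preprocessing exponents.
    return norm * x**a * (1 - x)**b

def chi2(params):
    return np.sum(((model(params, x_data) - target) / sigma) ** 2)

# Genetic-algorithm loop: evaluate a population, keep the fittest, mutate.
population = rng.uniform(0.1, 3.0, size=(80, 3))
for generation in range(200):
    scores = np.array([chi2(p) for p in population])
    parents = population[np.argsort(scores)[:20]]                  # selection
    offspring = parents[rng.integers(0, 20, size=80)]              # reproduction
    population = offspring + rng.normal(scale=0.05, size=(80, 3))  # mutation

best = min(population, key=chi2)
print("best parameters:", best, " chi2:", chi2(best))
```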

The practical implications of NNPDF3.0 are substantial, offering an enhanced tool for precise predictions of key LHC observables, including Higgs cross-sections and top quark processes. This facilitates stringent tests of the Standard Model and probes for new physics, reinforcing the critical role of accurate parton distributions in high-energy physics.
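In practice, the released NNPDF3.0 grids can be evaluated through the LHAPDF library. The short sketch below assumes a local LHAPDF installation with its Python bindings and the NNLO set downloaded (referred to here by the name NNPDF30_nnlo_as_0118); it queries the gluon and up-quark distributions at a scale near the Higgs mass.

```python
# Requires LHAPDF with its Python bindings and the NNPDF3.0 NNLO grid
# (assumed here to be installed under the name "NNPDF30_nnlo_as_0118").
import lhapdf

pdf = lhapdf.mkPDF("NNPDF30_nnlo_as_0118", 0)   # member 0 = central fit

x, Q = 0.01, 125.0                              # momentum fraction, scale in GeV
gluon = pdf.xfxQ(21, x, Q)                      # x*g(x, Q); PDG id 21 = gluon
up = pdf.xfxQ(2, x, Q)                          # x*u(x, Q); PDG id 2 = up quark

print(f"x*g(x={x}, Q={Q} GeV) = {gluon:.4f}")
print(f"x*u(x={x}, Q={Q} GeV) = {up:.4f}")
```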

Finally, the paper speculates on future developments and areas for further research, emphasizing how the NNPDF3.0 sets establish a new standard for PDF determination in the LHC era, while highlighting the ongoing importance of refining input datasets and theoretical inputs to meet the demands of next-generation collider experiments.
