PDF4LHC recommendations for LHC Run II (1510.03865v2)

Published 13 Oct 2015 in hep-ph and hep-ex

Abstract: We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. We finally discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.

Citations (1,610)

Summary

  • The paper presents updated recommendations for combining multiple PDF sets, each incorporating LHC Run I data, into a single combined set with reduced uncertainties.
  • It highlights unified methodologies and general-mass variable flavor number (GM-VFN) heavy-quark schemes that improve the convergence of high-energy predictions.
  • By incorporating $\alpha_s(m_Z^2)$ variations, the prescription delivers combined PDF+$\alpha_s$ uncertainties, making theoretical cross-section predictions more reliable when compared with data.

Overview of PDF4LHC Recommendations for LHC Run II

The document under review presents the updated PDF4LHC (Parton Distribution Functions for the Large Hadron Collider) recommendations for the second run of the LHC. It outlines methodological advances and the integration of new data into improved parton distribution functions, which are essential for precise theoretical predictions at the LHC. The recommendations are the product of a collaboration among the major PDF-fitting groups and combine the CT14, MMHT14, and NNPDF3.0 sets. The combination offers a systematic blend of the strengths of each set within a framework that refines both the understanding and the application of PDF uncertainties in the experimental context of LHC Run II.

Context and Challenges

The accurate determination of PDFs is crucial for the interpretation of LHC data, given the complexity of the interactions occurring at high energies. Most cross-sections at the LHC, from vector boson production to Higgs production, are sensitive to the underlying partonic structure of the proton. Previously, PDFs relied heavily on data from fixed-target experiments and older colliders, and noticeable differences remained among the major PDF sets due to methodological and computational choices.

Run I data from the LHC provided an empirical bedrock for reassessing these PDFs. The task was to integrate the new data effectively while addressing disparities between PDF sets that could skew theoretical predictions. The principal challenge lay in understanding differences in central predictions and uncertainties, especially for quantities such as the gluon distribution at large momentum fractions or the flavor separation of the quark and antiquark sea.
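
In practice, such differences are probed by evaluating a given distribution across all members of a set and inspecting the spread. The following is a minimal sketch, not part of the paper, assuming the LHAPDF 6 Python bindings and the combined PDF4LHC15_nnlo_mc Monte Carlo set are installed; the chosen x and Q values are purely illustrative.

```python
# Minimal sketch: spread of the gluon PDF at large x across the Monte Carlo
# replicas of the combined set. Assumes LHAPDF 6 with its Python bindings and
# the PDF4LHC15_nnlo_mc grid files installed (set name taken from the paper;
# availability depends on your LHAPDF installation).
import numpy as np
import lhapdf

pdfs = lhapdf.mkPDFs("PDF4LHC15_nnlo_mc")   # member 0 = central, 1..100 = replicas
x, Q = 0.3, 100.0                           # large momentum fraction, scale in GeV

# x*g(x, Q) for each replica (PID 21 = gluon); skip the central member 0
xg = np.array([p.xfxQ(21, x, Q) for p in pdfs[1:]])

central = xg.mean()        # Monte Carlo central value = replica mean
sigma = xg.std(ddof=1)     # 68% CL uncertainty estimated from the replica spread
print(f"x*g({x}, Q={Q} GeV) = {central:.4f} +/- {sigma:.4f}")
```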

Updates in PDF Determination

The document details significant progress in PDF determination facilitated by new data from LHC Run I and improved methodologies. A vital advancement is the development of a more unified approach to combine PDF information, leveraging common computational tools and shared perspectives among different groups.

New Data and Methodological Advances: The inclusion of LHC data reduced uncertainties significantly, enabling a more cohesive description of PDFs. For example, Run I measurements provided precise new constraints on the gluon distribution and on quark flavor separation, notably the strange-quark content of the proton.

Theoretical Consistency and Heavy Quark Schemes: The treatment of heavy-quark production differs among groups, with both FFN (fixed flavor number) and GM-VFN (general-mass variable flavor number) schemes available; the three sets entering the combination all use GM-VFN schemes, in different implementations. The GM-VFN approach improves convergence at higher $Q^2$ values, which are particularly pivotal in Higgs production calculations. This improved consistency reduces discrepancies seen between earlier PDF generations, narrowing the gap between theory and observed data.

The Inclusion of $\alpha_s(m_Z^2)$ Variations: Part of the innovation involves accounting for the uncertainty in the strong coupling constant, $\alpha_s(m_Z^2)$. The recommendation adopts a common central value of $\alpha_s(m_Z^2) = 0.118$ with an uncertainty of 0.0015, and the corresponding variations are essential for accurate prediction bands.
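
The delivered "_pdfas" sets append two extra members carrying the $\alpha_s(m_Z^2)$ variations so that the PDF and $\alpha_s$ uncertainties can be combined in quadrature. Below is a minimal sketch of that arithmetic, assuming a Monte Carlo style set whose last two members correspond to the downward and upward $\alpha_s$ variations (the layout described in the paper for the PDF4LHC15 "_pdfas" deliveries); the variable `sigma` and the numbers in the usage lines are purely illustrative.

```python
# Minimal sketch of the PDF+alpha_s uncertainty combination described in the
# recommendations: the PDF and alpha_s uncertainties are added in quadrature.
# Assumes `sigma` holds a cross section evaluated with every member of a
# Monte Carlo "_pdfas" set: member 0 is the central value, members 1..100 are
# PDF replicas, and the last two members are the alpha_s(mZ) = 0.1165 and
# 0.1195 variations (adapt the indices to the set you actually use).
import numpy as np

def pdf_alphas_uncertainty(sigma):
    sigma = np.asarray(sigma, dtype=float)
    replicas = sigma[1:-2]                        # PDF replicas (skip central member 0)
    sigma_as_lo, sigma_as_hi = sigma[-2], sigma[-1]

    central = replicas.mean()
    delta_pdf = replicas.std(ddof=1)              # 68% CL PDF uncertainty from replicas
    delta_as = 0.5 * (sigma_as_hi - sigma_as_lo)  # alpha_s uncertainty from the +/-0.0015 shift
    return central, np.hypot(delta_pdf, delta_as) # quadrature combination

# Illustrative usage with made-up numbers (picobarns):
values = [50.1] + list(np.random.normal(50.0, 0.8, 100)) + [49.5, 50.6]
central, delta = pdf_alphas_uncertainty(values)
print(f"sigma = {central:.2f} +/- {delta:.2f} pb (PDF+alpha_s)")
```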

Implications for LHC Run II and Future Developments

A combined statistical approach is now feasible, with each input set entering with equal weight; this is a significant shift from the previous method of taking an envelope of the individual sets to cover their spread. The new combination has a clearer statistical interpretation, aligns theoretical predictions more coherently with experimental data, and avoids inflating uncertainties without statistical justification.
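
To see what "equal weight" means operationally, here is a schematic contrast between the old envelope and the new pooled combination, assuming each group's prediction is already available as Monte Carlo replicas (the paper converts the Hessian CT14 and MMHT2014 sets into replicas before pooling); the arrays below are placeholders rather than real PDF output.

```python
# Schematic contrast between the old envelope prescription and the new
# equal-weight statistical combination. Assumes each input set is already
# represented as Monte Carlo replicas of the observable of interest; the
# placeholder numbers below stand in for real replica values.
import numpy as np

rng = np.random.default_rng(0)
set_a = rng.normal(1.00, 0.030, 300)   # placeholder replicas, e.g. CT14
set_b = rng.normal(1.02, 0.025, 300)   # placeholder replicas, e.g. MMHT2014
set_c = rng.normal(0.99, 0.035, 300)   # placeholder replicas, e.g. NNPDF3.0

# Old approach: envelope of the individual (mean +/- std) bands.
bands = [(s.mean() - s.std(), s.mean() + s.std()) for s in (set_a, set_b, set_c)]
env_lo, env_hi = min(b[0] for b in bands), max(b[1] for b in bands)

# New approach: pool equal numbers of replicas and treat them as one ensemble.
combined = np.concatenate([set_a, set_b, set_c])
print(f"envelope:    [{env_lo:.3f}, {env_hi:.3f}]")
print(f"combination: {combined.mean():.3f} +/- {combined.std(ddof=1):.3f}")
```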

Practical and Theoretical Significance: The new PDF4LHC recommendations enable precise computation of cross-sections for a wide range of processes at the LHC, aiding the exploration of both Standard Model and beyond-Standard-Model physics. This is crucial not only for testing predictions but also for flagging anomalies that could indicate new physics.
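
For users of the reduced Hessian deliveries, the uncertainty on any observable follows from the symmetric-eigenvector master formula, adding the shifts of the eigenvector members in quadrature. Below is a minimal sketch, assuming LHAPDF 6 with its Python bindings and the PDF4LHC15_nnlo_30 set installed; recent LHAPDF versions also provide a built-in PDFSet.uncertainty helper that can be used instead.

```python
# Minimal sketch: symmetric Hessian uncertainty with one of the reduced
# combined sets. Assumes LHAPDF 6 with Python bindings and the
# PDF4LHC15_nnlo_30 grids installed (member 0 = central, 1..30 = symmetric
# eigenvector members, as described in the paper).
import math
import lhapdf

pdfs = lhapdf.mkPDFs("PDF4LHC15_nnlo_30")
x, Q = 0.01, 125.0                         # momentum fraction and scale (GeV)

vals = [p.xfxQ(21, x, Q) for p in pdfs]    # x*g(x, Q) for every member
central = vals[0]
# Symmetric Hessian master formula: add eigenvector shifts in quadrature.
delta = math.sqrt(sum((v - central) ** 2 for v in vals[1:]))
print(f"x*g = {central:.4f} +/- {delta:.4f}")
```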

Future Outlook: The underlying framework is adaptable: it can incorporate further data from ongoing LHC runs while preserving the robustness of the statistical combination. The emerging standard could also make fuller use of NNLO (and eventually higher-order) calculations as the necessary computational tools mature.

Final Thoughts

The document presents a technically detailed yet strategically important update to the usage of PDFs in high-energy physics. By unifying disparate approaches and implementing a systematic combination strategy, it reduces uncertainty, enhances predictive power, and aligns theoretical predictions more closely with LHC data. This keeps theory a powerful tool for interpreting the results of LHC Run II and for probing physics beyond them. By setting robust guidelines for how these PDFs should be applied, the recommendations give physicists a common framework for exploring the finer details of particle physics in upcoming measurements.
