Multi-loop Integrand Reduction via Multivariate Polynomial Division (1312.1627v1)

Published 5 Dec 2013 in hep-ph and hep-th

Abstract: We present recent developments on the topic of the integrand reduction of scattering amplitudes. Integrand-level methods allow one to express an amplitude as a linear combination of Master Integrals, by performing operations on the corresponding integrands. This approach has already been successfully applied and automated at one loop, and recently extended to higher loops. We describe a coherent framework based on simple concepts of algebraic geometry, such as multivariate polynomial division, which can be used to obtain the integrand decomposition of any amplitude at any loop order. In the one-loop case, we discuss an improved reduction algorithm, based on the application of the Laurent series expansion to the integrands, which has been implemented in the semi-numerical library Ninja. At two loops, we present the reduction of five-point amplitudes in N=4 SYM, with a unitarity-based construction of the integrand. We also describe the multi-loop divide-and-conquer approach, which can always be used to find the integrand decomposition of any Feynman graph, regardless of the form and the complexity of the integrand, with purely algebraic operations.
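The core algebraic operation the abstract refers to, multivariate polynomial division modulo an ideal, can be sketched with a computer-algebra system. The polynomials below are illustrative placeholders, not the actual cut equations of any amplitude: one forms a Groebner basis of the ideal generated by the "cut" polynomials, then divides the numerator, obtaining quotients plus a unique remainder (the analogue of the residual integrand).

```python
# Sketch of multivariate polynomial division via a Groebner basis,
# using sympy. The ideal generators g1, g2 are hypothetical examples.
from sympy import symbols, groebner, reduced

x, y = symbols('x y')

# Generators of the ideal (stand-ins for cut/denominator polynomials)
g1 = x**2 + y**2 - 1
g2 = x*y

# Groebner basis in lexicographic order makes the remainder unique
G = groebner([g1, g2], x, y, order='lex')

# Polynomial to decompose (stand-in for a numerator)
f = x**3 + x*y**2 + y**3

# Division: f = sum_i q_i * G_i + r, with r the normal form of f
q, r = reduced(f, list(G.exprs), x, y, order='lex')

# The identity f == sum(q_i * G_i) + r holds by construction
assert (f - (sum(qi*gi for qi, gi in zip(q, list(G.exprs))) + r)).expand() == 0
```

Because `G` is a Groebner basis, the remainder `r` is independent of the order of the divisors; this uniqueness is what makes the integrand decomposition well defined in the multivariate setting.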
