Reduction with Degenerate Gram matrix for One-loop Integrals (2205.03000v2)

Published 6 May 2022 in hep-ph

Abstract: An improved PV-reduction method for one-loop integrals with an auxiliary vector $R$ was proposed in \cite{Feng:2021enk,Hu:2021nia}, and it was shown to be a self-complete method in \cite{Feng:2022uqp}. In this method, analytic reduction coefficients are easily produced by recursion relations, with the Gram determinant appearing in denominators. The singularity caused by the Gram determinant is a well-known issue, and it is important to address these divergences in a given frame. In this paper, we propose a systematic algorithm to deal with this problem within our method. The key idea is that the master integral of the highest topology is decomposed into a combination of master integrals of lower topologies. By demanding the cancellation of divergences in the general reduction coefficients, we solve for the decomposition coefficients as a Taylor series in the Gram determinant. The same idea can be applied to other kinds of divergences.
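As a minimal illustration of the degenerate-Gram-matrix issue the abstract refers to, the sketch below builds the Gram matrix $G_{ij} = p_i \cdot p_j$ of external momenta (Minkowski signature) and shows its determinant vanishing when the momenta become linearly dependent. The helper names (`mdot`, `gram_matrix`) are hypothetical and the numeric kinematics are invented for illustration; this is not code from the paper.

```python
def mdot(p, q):
    """Minkowski dot product with signature (+, -, -, -)."""
    return p[0]*q[0] - p[1]*q[1] - p[2]*q[2] - p[3]*q[3]

def gram_matrix(momenta):
    """Gram matrix G_ij = p_i . p_j of a list of 4-momenta."""
    return [[mdot(p, q) for q in momenta] for p in momenta]

def det(m):
    """Determinant by Laplace expansion along the first row
    (adequate for the small matrices of one-loop kinematics)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

# Generic kinematics: the Gram determinant is nonzero, so reduction
# coefficients with det(G) in the denominator are finite.
p1 = (5.0, 0.0, 0.0, 3.0)
p2 = (4.0, 1.0, 0.0, 0.0)
print(det(gram_matrix([p1, p2])))   # -> -160.0 (nonzero)

# Degenerate kinematics: p2 proportional to p1, so det(G) = 0 and
# naive reduction coefficients diverge -- the case the paper treats
# by expanding decomposition coefficients in a Taylor series in det(G).
p2_deg = (10.0, 0.0, 0.0, 6.0)
print(det(gram_matrix([p1, p2_deg])))  # -> 0.0
```

In the degenerate limit, the $1/\det G$ poles in the general reduction coefficients must cancel, which is the constraint the paper uses to fix the decomposition coefficients order by order in $\det G$.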

Citations (10)