Optimized Self-Similar Borel Summation (2311.13913v1)

Published 23 Nov 2023 in nlin.CD, cond-mat.other, hep-ph, math-ph, and math.MP

Abstract: The method of Fractional Borel Summation is suggested in conjunction with self-similar factor approximants. The method, used for extrapolating asymptotic expansions valid at small variables to large variables, including variables tending to infinity, is described. It is based on a combination of optimized perturbation theory, self-similar approximation theory, and Borel-type transformations. A general fractional Borel transformation of the original series is employed, and the transformed series is resummed so as to adhere to the asymptotic power laws. The starting point is the formulation of dynamics in the space of approximations by employing the notion of self-similarity. The flow in the approximation space is controlled, and "deep" control is incorporated into the definitions of the self-similar approximants. The class of self-similar approximations satisfying power-law behavior by design, such as self-similar factor approximants, is chosen for reasons of transparency, explicitness, and convenience. A detailed comparison of the different methods is performed on a rather large set of examples, employing self-similar factor approximants, self-similar iterated root approximants, and self-similarly modified Padé-Borel approximations.

Citations (2)
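
As a point of reference for the methods compared in the paper, the sketch below implements conventional Padé-Borel summation (one of the baseline techniques the abstract mentions, not the paper's optimized self-similar variant) for the classic Euler series E(x) ~ sum_n (-1)^n n! x^n, whose Borel transform is 1/(1 + t). The function name borel_sum, the truncation order N, and the choice of test series are illustrative assumptions, not taken from the paper.

import numpy as np
from math import factorial
from scipy.integrate import quad
from scipy.interpolate import pade

# Divergent asymptotic series for the Euler function
#   E(x) = int_0^inf exp(-t) / (1 + x t) dt  ~  sum_n (-1)^n n! x^n   (x -> 0+)
N = 8
a = [(-1) ** n * factorial(n) for n in range(N + 1)]

# Borel transform of the series: b_n = a_n / n!  (here it sums to 1/(1 + t))
b = [a_n / factorial(n) for n, a_n in enumerate(a)]

# Pade approximant of the truncated Borel transform (denominator order 1 is
# enough for this toy series; higher orders make the linear system singular
# because the exact Borel transform is already a [0/1] rational function)
p, q = pade(b, 1)  # returns numpy.poly1d numerator and denominator

def borel_sum(x):
    # Laplace integral of the Pade-approximated Borel transform:
    #   f(x) = int_0^inf exp(-t) * P(x t) / Q(x t) dt
    integrand = lambda t: np.exp(-t) * p(x * t) / q(x * t)
    value, _ = quad(integrand, 0.0, np.inf)
    return value

if __name__ == "__main__":
    for x in (0.1, 1.0, 5.0):
        exact, _ = quad(lambda t: np.exp(-t) / (1.0 + x * t), 0.0, np.inf)
        print(f"x = {x:4.1f}:  Pade-Borel = {borel_sum(x):.6f}   exact = {exact:.6f}")

Even though the series diverges for every nonzero x, the resummed values agree with the exact integral well beyond the small-x region; this extrapolation to large variables is what the paper's fractional and self-similar refinements aim to improve further.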
