Computation of attractor dimension and maximal sums of Lyapunov exponents using polynomial optimization (2510.14870v1)

Published 16 Oct 2025 in math.DS, math.OC, and nlin.CD

Abstract: Two approaches are presented for computing upper bounds on Lyapunov exponents and their sums, and on Lyapunov dimension, among all trajectories of a dynamical system governed by ordinary differential equations. The first approach expresses a sum of Lyapunov exponents as a time average in an augmented dynamical system and then applies methods for bounding time averages. This generalizes the work of Oeri & Goluskin (Nonlinearity 36:5378-5400, 2023), who bounded the single leading Lyapunov exponent. The second approach considers a different augmented dynamical system, where bounds on sums of Lyapunov exponents are implied by stability of certain sets, and such stability is verified using Lyapunov function methods. Both of our approaches can also be adapted to directly compute bounds on Lyapunov dimension, which in turn implies bounds on the fractal dimension of a global attractor. For systems of ordinary differential equations with polynomial right-hand sides, all of our bounding formulations lead to polynomial optimization problems with sum-of-squares constraints. These sum-of-squares problems can be solved computationally for any particular system to yield numerical bounds, provided the number of variables and the polynomial degrees are not prohibitive. Most of our upper bounds are proven to be sharp under relatively weak assumptions. In the case of the polynomial optimization problems, sharpness means that upper bounds converge to the exact values as polynomial degrees are raised. Computational examples demonstrate upper bounds that are sharp to several digits, including for a six-dimensional dynamical system where sums of Lyapunov exponents are maximized on periodic orbits.
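The Lyapunov dimension mentioned above is presumably the usual Kaplan-Yorke expression d_L = j + (λ_1 + ... + λ_j) / |λ_{j+1}|, where the exponents are ordered from largest to smallest and j is the largest index for which their partial sum is nonnegative; upper bounds on sums of exponents therefore translate into upper bounds on the dimension.

To illustrate the kind of sum-of-squares problem these formulations lead to, here is a minimal sketch, not the authors' code. For a scalar ODE xdot = f(x) the tangent dynamics are trivial and the Lyapunov exponent of any trajectory is the long-time average of f'(x) along it, so by the standard auxiliary-function argument for bounding time averages, a constant U upper-bounds the maximal Lyapunov exponent whenever U - f'(x) - f(x)V'(x) is a sum of squares for some polynomial V. The toy system xdot = x - x^3, the degree-4 ansatz for V, and the SumOfSquares Python package (with a picos-supported SDP solver) are all assumptions made for this sketch; the paper's formulations cover general polynomial ODEs, sums of exponents, and Lyapunov dimension.

```python
# Minimal sketch (assumed setup, not the authors' code): bound the maximal
# Lyapunov exponent of the toy scalar ODE  xdot = x - x^3  by minimizing U
# subject to  U - f'(x) - f(x) V'(x)  being a sum of squares, with V a
# polynomial auxiliary function whose coefficients are decision variables.
# Assumes the SumOfSquares package (pip install SumOfSquares) and an SDP
# solver supported by picos (e.g. CVXOPT or MOSEK).

import sympy as sp
from SumOfSquares import SOSProblem

x, U = sp.symbols('x U')
f = x - x**3                     # toy vector field
fprime = sp.diff(f, x)           # instantaneous stretching rate: 1 - 3x^2

# Auxiliary polynomial V(x) = c0*x^2 + c1*x^3 + c2*x^4; degree 4 is an
# arbitrary illustrative choice.
c0, c1, c2 = sp.symbols('c0 c1 c2')
V = c0*x**2 + c1*x**3 + c2*x**4

prob = SOSProblem()
# Certify  U - f'(x) - f(x)*V'(x) >= 0  for all x via an SOS decomposition;
# symbols not listed as polynomial variables (U, c0, c1, c2) become
# decision variables of the semidefinite program.
prob.add_sos_constraint(U - fprime - f*sp.diff(V, x), [x])
U_var = prob.sym_to_var(U)
prob.set_objective('min', U_var)
prob.solve()

print('Upper bound on the maximal Lyapunov exponent:', U_var.value)
```

On this toy problem the exact maximum is 1, attained at the unstable equilibrium x = 0 where f'(0) = 1, and the certificate V(x) = 1.5 x^2 already achieves U = 1, so the solver returns a bound near 1. Raising the degree of V is what sharpens such bounds in general, mirroring the convergence statement in the abstract.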
