
Gradual convergence for Langevin dynamics on a degenerate potential (2209.11026v4)

Published 22 Sep 2022 in math.PR, math-ph, math.DS, and math.MP

Abstract: In this paper, we study an ordinary differential equation with a degenerate global attractor at the origin, to which we add white noise whose intensity is regulated by a small parameter. Under general conditions, for any fixed intensity, the solution of this stochastic dynamics converges, as time tends to infinity, exponentially fast in total variation distance to a unique equilibrium distribution. We suitably accelerate the random dynamics and show that the preceding convergence is gradual: the function that assigns to each fixed $t\geq 0$ the total variation distance between the accelerated random dynamics at time $t$ and its equilibrium distribution converges, as the noise intensity tends to zero, to a decreasing function with values in $(0,1)$. Moreover, we prove that, for each fixed $t \geq 0$, this limit function equals the total variation distance between the time-$t$ marginal of a stochastic differential equation that comes down from infinity and its corresponding equilibrium distribution. This completes the classification of all possible behaviors of the total variation distance between the time marginal of the aforementioned stochastic dynamics and its invariant measure for one-dimensional well-behaved convex potentials. In addition, there is no cut-off phenomenon for this one-parameter family of random processes, and asymptotics of the mixing times are derived.
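
To make the setup concrete, here is a minimal numerical sketch, not taken from the paper: it simulates the one-dimensional Langevin SDE $dX_t = -V'(X_t)\,dt + \varepsilon\,dB_t$ for the prototypical degenerate convex potential $V(x) = x^4/4$ (so $V''(0) = 0$ and the origin is a degenerate attractor), and estimates the total variation distance between the law of $X_t$ and the equilibrium distribution $\pi_\varepsilon(x) \propto e^{-2V(x)/\varepsilon^2}$ from histograms. The choice of potential, the discretization, and the Monte Carlo estimator are illustrative assumptions, not the authors' construction.

```python
# Illustrative sketch (assumed setup, not the paper's method): Euler-Maruyama
# simulation of  dX_t = -V'(X_t) dt + eps dB_t  with the degenerate convex
# potential V(x) = x**4 / 4, plus a histogram estimate of the total variation
# distance between the time-t marginal and the equilibrium distribution
# pi_eps(x) proportional to exp(-2 V(x) / eps**2).
import numpy as np

rng = np.random.default_rng(0)

def drift(x):
    # -V'(x) for V(x) = x**4 / 4; V''(0) = 0 makes the attractor degenerate.
    return -x**3

def simulate(eps, x0, t_end, dt=1e-2, n_paths=10_000):
    """Euler-Maruyama marginals at time t_end, all paths started at x0."""
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(int(t_end / dt)):
        x += drift(x) * dt + eps * np.sqrt(dt) * rng.standard_normal(n_paths)
    return x

def tv_estimate(a, b, bins=200):
    """Histogram estimate of the total variation distance between samples."""
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    pa, _ = np.histogram(a, bins=bins, range=(lo, hi))
    pb, _ = np.histogram(b, bins=bins, range=(lo, hi))
    return 0.5 * np.abs(pa / pa.sum() - pb / pb.sum()).sum()

eps = 0.3
# Crude equilibrium samples: run the dynamics long past its relaxation time.
eq = simulate(eps, x0=0.0, t_end=100.0)
for t in (0.5, 1.0, 2.0, 4.0, 8.0):
    xt = simulate(eps, x0=1.0, t_end=t)
    print(f"t = {t:4.1f}:  TV(X_t, pi) ~ {tv_estimate(xt, eq):.3f}")
```

This only illustrates the unaccelerated dynamics at one fixed noise level; the paper's result concerns the limit, as $\varepsilon \to 0$, of the analogous curve $t \mapsto d_{TV}$ computed on a suitably accelerated time scale, which converges to the decreasing limit profile described in the abstract.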

Citations (4)
