
Some strong limit theorems in averaging (2209.10364v4)

Published 21 Sep 2022 in math.PR

Abstract: The paper deals with the fast-slow motions setup in the discrete time $X^\epsilon((n+1)\epsilon)=X^\epsilon(n\epsilon)+\epsilon B(X^\epsilon(n\epsilon),\xi(n))$, $n=0,1,\dots,[T/\epsilon]$, and the continuous time $\frac{dX^\epsilon(t)}{dt}=B(X^\epsilon(t),\xi(t/\epsilon))$, $t\in[0,T]$, where $B$ is a smooth vector function and $\xi$ is a sufficiently fast mixing stationary stochastic process. It has been known since 1966 (Khasminskii) that if $\bar X$ is the averaged motion, then $G^\epsilon=\epsilon^{-1/2}(X^\epsilon-\bar X)$ converges weakly to a Gaussian process $G$. We show that for each $\epsilon$ the processes $\xi$ and $G$ can be redefined on a sufficiently rich probability space, without changing their distributions, so that $E\sup_{0\leq t\leq T}|G^\epsilon(t)-G(t)|^{2M}=O(\epsilon^{\delta})$ for some $\delta>0$, which also yields an $O(\epsilon^{\delta/3})$ estimate on the Prokhorov distance between the distributions of $G^\epsilon$ and $G$. In the product case $B(x,\xi)=\Sigma(x)\xi$ we obtain almost sure convergence estimates of the form $\sup_{0\leq t\leq T}|G^\epsilon(t)-G(t)|=O(\epsilon^{\delta})$ a.s., as well as the functional form of the law of the iterated logarithm for $G^\epsilon$. We note that our mixing assumptions are adapted to fast motions generated by important classes of dynamical systems.
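
As a rough numerical illustration of the discrete-time scheme above, the sketch below simulates $X^\epsilon$, the averaged motion $\bar X$, and the normalized deviation $G^\epsilon=\epsilon^{-1/2}(X^\epsilon-\bar X)$. It is not taken from the paper: the choice $B(x,\xi)=-x+\xi$, the mean-zero Gaussian AR(1) fast process (a simple stand-in for a fast mixing stationary process), and all parameter values are assumptions made purely for illustration.

```python
# Minimal sketch (not from the paper) of the discrete-time fast-slow scheme
# and the normalized deviation G^eps(t) = eps^{-1/2} (X^eps(t) - bar{X}(t)).
# Illustrative assumptions: xi(n) is a mean-zero Gaussian AR(1) sequence and
# B(x, xi) = -x + xi, so the averaged drift is bar{B}(x) = -x and the
# averaged motion is bar{X}(t) = x0 * exp(-t).
import numpy as np

def simulate_G_eps(eps, T=1.0, x0=1.0, rho=0.5, seed=None):
    """Return the time grid and G^eps on [0, T] for a given eps."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / eps)

    # Fast process: stationary AR(1), xi(n+1) = rho * xi(n) + standard noise.
    xi = np.empty(n_steps)
    xi[0] = rng.normal(scale=1.0 / np.sqrt(1.0 - rho**2))  # stationary start
    for n in range(1, n_steps):
        xi[n] = rho * xi[n - 1] + rng.normal()

    # Slow motion: X^eps((n+1)eps) = X^eps(n eps) + eps * B(X^eps(n eps), xi(n)).
    X = np.empty(n_steps + 1)
    X[0] = x0
    for n in range(n_steps):
        X[n + 1] = X[n] + eps * (-X[n] + xi[n])  # B(x, xi) = -x + xi

    t = eps * np.arange(n_steps + 1)
    X_bar = x0 * np.exp(-t)                      # averaged motion
    return t, (X - X_bar) / np.sqrt(eps)         # G^eps

if __name__ == "__main__":
    for eps in (1e-2, 1e-3, 1e-4):
        t, G = simulate_G_eps(eps, seed=0)
        print(f"eps={eps:g}  sup_t |G^eps(t)| = {np.abs(G).max():.3f}")
```

Only the drift $B$, its average, and the fast process need to be changed to experiment with other setups; the $\epsilon^{-1/2}$ normalization keeps the deviation process at order one as $\epsilon$ decreases.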
