
Stable limits for Markov chains via the Principle of Conditioning (1808.04329v1)

Published 13 Aug 2018 in math.PR

Abstract: We study limit theorems for partial sums of instantaneous functions of a homogeneous Markov chain on a general state space. The summands are heavy-tailed and the limits are stable distributions. The conditions imposed on the transition operator $P$ of the Markov chain ensure that the limit is the same as if the summands were independent. Such a scheme admits a physical interpretation, as given in Jara et al. (Ann. Appl. Probab., 19 (2009), 2270--2300). We considerably extend the results of Jara et al. (ibid.) and Cattiaux and Manou-Abi (ESAIM Probab. Stat., 18 (2014), 468--486). We show that the theory holds under the assumption of operator uniform integrability in $L^2$ of $P$ (a notion introduced by Wu (J. Funct. Anal., 172 (2000), 301--376)) plus the $L^2$-spectral gap property. If we strengthen the uniform integrability in $L^2$ to hyperboundedness, then the $L^2$-spectral gap property can be relaxed to strong mixing at a geometric rate (in practice: to geometric ergodicity). We provide an example of a Markov chain on a countable space that is uniformly integrable in $L^2$ (and admits an $L^2$-spectral gap) but is not hyperbounded. Moreover, we show by example that hyperboundedness is still a weaker property than $\phi$-mixing, which enlarges the range of models of interest. What makes our assumptions work is a new, efficient version of the Principle of Conditioning that operates with conditional characteristic functions rather than predictable characteristics.
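
To fix ideas, the type of statement studied in the paper can be sketched as follows (the symbols $f$, $b_n$, $c_n$ and $\mu_\alpha$ are illustrative notation, not quoted from the paper): for a homogeneous Markov chain $(X_k)_{k \ge 1}$ with transition operator $P$ and a function $f$ whose stationary distribution lies in the domain of attraction of an $\alpha$-stable law, $0 < \alpha < 2$, the assumptions on $P$ guarantee that

$$\frac{1}{b_n} \sum_{k=1}^{n} \bigl( f(X_k) - c_n \bigr) \xrightarrow{\;d\;} \mu_\alpha,$$

where the norming constants $b_n$, the centring $c_n$ and the $\alpha$-stable limit law $\mu_\alpha$ are the same as those that would arise if the summands $f(X_k)$ were independent with the stationary marginal distribution.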
