
On the convergence of series of dependent random variables (2006.08171v1)

Published 15 Jun 2020 in math.PR

Abstract: Given a sequence $(X_n)$ of symmetric random variables taking values in a Hilbert space, an interesting open problem is to determine the conditions under which the series $\sum_{n=1}^\infty X_n$ is almost surely convergent. For independent random variables, it is well known that if $\sum_{n=1}^\infty \mathbb{E}(\|X_n\|^2) < \infty$, then $\sum_{n=1}^\infty X_n$ converges almost surely. This has been extended to some classes of dependent variables (namely, negatively associated random variables), but in the general setting of dependent variables the problem remains open. This paper considers the case where each variable $X_n$ is given as a linear combination $a_{n,1}Z_1 + \ldots + a_{n,n}Z_n$, where $(Z_n)$ is a sequence of independent symmetric random variables of unit variance and $(a_{n,k})$ are constants. For Gaussian random variables, this is the general setting. We obtain a sufficient condition for the almost sure convergence of $\sum_{n=1}^\infty X_n$ which is also sufficient for the almost sure convergence of $\sum_{n=1}^\infty \pm X_n$ for all (non-random) changes of sign. The result is based on an important bound on the mean of the random variable $\sup(\|X_1 + \ldots + X_k\| : 1 \leq k \leq n)$, which extends the classical L\'evy inequality and is of independent interest.
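The construction described in the abstract can be sketched numerically. The snippet below simulates the triangular scheme $X_n = a_{n,1}Z_1 + \ldots + a_{n,n}Z_n$ and the running supremum that the Lévy-type bound controls. The coefficient choice $a_{n,k} = 1/n^2$ is a hypothetical illustration (not taken from the paper): it makes each $X_n$ depend on all of $Z_1, \ldots, Z_n$ while keeping $\sum_n \mathbb{E}(X_n^2) = \sum_n n \cdot (1/n^2)^2 = \sum_n 1/n^3 < \infty$.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500

# Independent symmetric random variables of unit variance (here: standard Gaussians).
Z = rng.standard_normal(N)

# Hypothetical triangular coefficients a_{n,k} = 1/n^2 for k <= n, zero otherwise.
# This choice is for illustration only; the paper's condition covers general (a_{n,k}).
n = np.arange(1, N + 1)
A = np.where(n[None, :] <= n[:, None], 1.0 / n[:, None] ** 2, 0.0)

# X_n = a_{n,1} Z_1 + ... + a_{n,n} Z_n: each X_n depends on all of Z_1, ..., Z_n.
X = A @ Z

# Partial sums S_k = X_1 + ... + X_k, and the running maximal function
# sup(|S_k| : 1 <= k <= n) whose mean the Levy-type inequality bounds.
S = np.cumsum(X)
M = np.maximum.accumulate(np.abs(S))

print(S[-1], M[-1])
```

With these coefficients the second moments are $\mathbb{E}(X_n^2) = \sum_{k \le n} a_{n,k}^2 = 1/n^3$, so the classical summability condition holds and the simulated partial sums stabilize; in the scalar case the Hilbert-space norm reduces to the absolute value used above.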
