
On the convergence of series of dependent random variables

Published 15 Jun 2020 in math.PR (arXiv:2006.08171v1)

Abstract: Given a sequence $(X_n)$ of symmetric random variables taking values in a Hilbert space, an interesting open problem is to determine the conditions under which the series $\sum_{n=1}^\infty X_n$ is almost surely convergent. For independent random variables, it is well known that if $\sum_{n=1}^\infty \mathbb{E}(|X_n|^2) < \infty$, then $\sum_{n=1}^\infty X_n$ converges almost surely. This has been extended to some cases of dependent variables (namely, negatively associated random variables), but in the general setting of dependent variables the problem remains open. This paper considers the case where each variable $X_n$ is given as a linear combination $a_{n,1}Z_1 + \ldots + a_{n,n}Z_n$, where $(Z_n)$ is a sequence of independent symmetric random variables of unit variance and the $(a_{n,k})$ are constants. For Gaussian random variables, this is the general setting. We obtain a sufficient condition for the almost sure convergence of $\sum_{n=1}^\infty X_n$ which is also sufficient for the almost sure convergence of $\sum_{n=1}^\infty \pm X_n$ for all (non-random) changes of sign. The result is based on a bound on the mean of the random variable $\sup\{|X_1 + \ldots + X_k| : 1 \leq k \leq n\}$ which extends the classical L\'evy inequality and is of independent interest.
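To make the setting of the abstract concrete, here is a minimal NumPy sketch (not taken from the paper) of the construction $X_n = a_{n,1}Z_1 + \ldots + a_{n,n}Z_n$ with independent Rademacher variables $Z_k$ (symmetric, unit variance), tracking both the partial sums $S_n$ and the running maximum $\sup_{k\le n}|S_k|$ that a Lévy-type bound controls. The function name and the coefficient rule passed in are illustrative assumptions; the diagonal choice $a_{n,k} = \mathbf{1}[k=n]/n$ used below recovers the classical independent case, where $\sum_n \mathbb{E}(|X_n|^2) = \sum_n 1/n^2 < \infty$ guarantees almost sure convergence.

```python
import numpy as np

def partial_sum_sup(a, n_terms, seed=0):
    """Simulate S_n = X_1 + ... + X_n where X_n = a(n,1) Z_1 + ... + a(n,n) Z_n
    for independent symmetric unit-variance Z_k (Rademacher here).

    Returns the partial sums S_1, ..., S_N together with the running
    maxima sup_{k<=n} |S_k|.  The coefficient rule `a` is a hypothetical
    stand-in supplied by the caller, not a choice made in the paper.
    """
    rng = np.random.default_rng(seed)
    z = rng.choice([-1.0, 1.0], size=n_terms)   # Z_1, ..., Z_N
    s = np.zeros(n_terms)                       # partial sums S_n
    sups = np.zeros(n_terms)                    # running sup of |S_k|
    acc = 0.0
    running = 0.0
    for n in range(1, n_terms + 1):
        coeffs = np.array([a(n, k) for k in range(1, n + 1)])
        x_n = coeffs @ z[:n]                    # X_n = sum_k a(n,k) Z_k
        acc += x_n
        s[n - 1] = acc
        running = max(running, abs(acc))
        sups[n - 1] = running
    return s, sups

# Independent special case a(n,k) = 1[k=n]/n, i.e. X_n = Z_n / n:
# sum_n E(|X_n|^2) = sum_n 1/n^2 < infinity, so S_n converges a.s.
s, sups = partial_sum_sup(lambda n, k: (1.0 / n) if k == n else 0.0, 2000)
```

Passing a non-diagonal coefficient rule makes the $X_n$ genuinely dependent (each one reuses all of $Z_1, \ldots, Z_n$), which is exactly the regime the paper's sufficient condition addresses.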

