Large deviations for sums of multivariate stretched-exponential random variables: the few-big-jumps principle
Abstract: Large deviations for sums of i.i.d.\ random variables with stretched-exponential tails (also called Weibull or semi-exponential tails) have been well understood since the 1960s, going back to Nagaev's seminal work. Many extensions in the $1$-dimensional setting have been developed since then, showing that such deviations are typically governed by a single big jump. In higher dimensions, a corresponding theory has remained largely undeveloped. This work provides such a multivariate extension and establishes large deviation results for sums of i.i.d.\ random vectors in $\mathbb{R}^k$ under fairly general assumptions. Roughly speaking, for some $\alpha\in(0,1)$, the log-probability that one random vector, divided by $x$, exceeds a threshold $t$ in all components behaves asymptotically, for large $x$, like $x^\alpha$ times a negative infimum of a function $\mathcal{J}$. We prove large deviation results for sums of i.i.d.\ copies, where the rate function is given by a minimization over at most $k$ summands of $\mathcal{J}$. This establishes a few-big-jumps principle that generalizes the classical $1$-dimensional phenomenon: the deviation is typically realized by \emph{at most} $k$ independent vectors. The results are applied to absolute powers of multivariate Gaussian vectors as well as to various other examples. They also allow us to study random projections of high-dimensional $\ell_p^N$-balls, revealing interesting insights into the appearance of light- and heavy-tailed distributions in high-dimensional geometry.
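The tail behavior and the rate function described in the abstract can be sketched in display form. The following is an illustrative reconstruction only: the precise hypotheses, the domain of the infima, and the definition of $\mathcal{J}$ are given in the paper; here $X$ denotes a generic random vector and $t\in\mathbb{R}^k$ a componentwise threshold.

```latex
% Illustrative form of the stretched-exponential tail assumption:
% for some \alpha \in (0,1) and a function \mathcal{J},
\[
  \log \mathbb{P}\bigl( X \ge t\,x \ \text{componentwise} \bigr)
  \;\sim\; -\,x^{\alpha} \inf \mathcal{J}(t), \qquad x \to \infty.
\]
% The resulting large deviation rate for the sum of i.i.d. copies is then,
% roughly, a minimization over at most k summands of \mathcal{J}
% (the "few big jumps", realized by at most k independent vectors):
\[
  \inf \Bigl\{\, \textstyle\sum_{i=1}^{m} \mathcal{J}(t_i)
  \;:\; m \le k,\ t_1, \dots, t_m \ \text{jointly realize the deviation} \Bigr\}.
\]
```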