On Dimension-dependent concentration for convex Lipschitz functions in product spaces (2106.06121v3)
Abstract: Let $n\geq 1$, $K>0$, and let $X=(X_1,X_2,\dots,X_n)$ be a random vector in $\mathbb{R}^n$ with independent $K$--subgaussian components. We show that for every $1$--Lipschitz convex function $f$ on $\mathbb{R}^n$ (Lipschitz with respect to the Euclidean metric), $$ \max\big(\mathbb{P}\big\{f(X)-{\rm Med}\,f(X)\geq t\big\},\,\mathbb{P}\big\{f(X)-{\rm Med}\,f(X)\leq -t\big\}\big)\leq \exp\bigg( -\frac{c\,t^2}{K^2\log\big(2+\frac{n}{t^2/K^2}\big)}\bigg),\quad t>0, $$ where $c>0$ is a universal constant. The estimates are optimal in the sense that for every $n\geq \tilde C$ and $t>0$ there exist a product probability distribution $X$ in $\mathbb{R}^n$ with $K$--subgaussian components and a $1$--Lipschitz convex function $f$ such that $$ \mathbb{P}\big\{\big|f(X)-{\rm Med}\,f(X)\big|\geq t\big\}\geq \tilde c\,\exp\bigg( -\frac{\tilde C\,t^2}{K^2\log\big(2+\frac{n}{t^2/K^2}\big)}\bigg). $$ The obtained deviation estimates for subgaussian variables are in sharp contrast with the case of variables with bounded $\|X_i\|_{\psi_p}$--norms for $p\in[1,2)$.
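As a quick numerical sanity check of the upper bound, the following minimal Python sketch (not part of the paper) compares the empirical two-sided tail of $f(X)-{\rm Med}\,f(X)$ with the right-hand side of the inequality, taking $f(x)=\|x\|_2$ (a $1$--Lipschitz convex function) and i.i.d. standard Gaussian components. The values c = 1 and K = 1 are illustrative placeholders, since the abstract only asserts the existence of some universal constant $c>0$.

```python
import numpy as np

# Monte Carlo sketch (illustration only, not from the paper): compare the
# empirical two-sided tail of f(X) - Med f(X) with the stated bound, for the
# 1-Lipschitz convex function f(x) = ||x||_2 and independent standard
# Gaussian components (K-subgaussian with K of order 1).

rng = np.random.default_rng(0)

n = 100            # dimension
trials = 100_000   # Monte Carlo samples
c, K = 1.0, 1.0    # hypothetical constants; the paper only proves some c > 0

X = rng.standard_normal((trials, n))   # independent subgaussian components
f_vals = np.linalg.norm(X, axis=1)     # f(x) = ||x||_2, convex and 1-Lipschitz
med = np.median(f_vals)

for t in (0.5, 1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(f_vals - med) >= t)
    # Two-sided version of the stated one-sided bound (hence the factor 2).
    bound = 2.0 * np.exp(-c * t**2 / (K**2 * np.log(2.0 + n / (t**2 / K**2))))
    print(f"t={t:3.1f}  empirical tail={empirical:.2e}  2*exp bound={bound:.2e}")
```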