Sharp High-dimensional Central Limit Theorems for Log-concave Distributions (2207.14536v4)
Abstract: Let $X_1,\dots,X_n$ be i.i.d. log-concave random vectors in $\mathbb{R}^d$ with mean $0$ and covariance matrix $\Sigma$. We study the problem of quantifying the normal approximation error for $W=n^{-1/2}\sum_{i=1}^n X_i$ with explicit dependence on the dimension $d$. Specifically, without any restriction on $\Sigma$, we show that the approximation error over rectangles in $\mathbb{R}^d$ is bounded by $C(\log^{13}(dn)/n)^{1/2}$ for some universal constant $C$. Moreover, if the Kannan-Lov\'asz-Simonovits (KLS) spectral gap conjecture is true, this bound can be improved to $C(\log^{3}(dn)/n)^{1/2}$. This improved bound is optimal in both $n$ and $d$ in the regime $\log n=O(\log d)$. We also give $p$-Wasserstein bounds for all $p\geq2$ and a Cram\'er-type moderate deviation result for this normal approximation error, all of which are optimal under the KLS conjecture. To prove these bounds, we develop a new Gaussian coupling inequality that gives almost dimension-free bounds for projected versions of the $p$-Wasserstein distance for every $p\geq2$. We prove this coupling inequality by combining Stein's method with Eldan's stochastic localization procedure.
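The quantity the abstract bounds can be seen empirically with a minimal Monte Carlo sketch (not from the paper): take centered $\mathrm{Exp}(1)$ coordinates, a log-concave distribution with $\Sigma = I_d$, form $W = n^{-1/2}\sum_i X_i$, and compare the empirical probability of one-sided rectangles $\{w : \max_j w_j \le t\}$ against the limiting Gaussian probability $\Phi(t)^d$. The dimensions, sample sizes, and rectangle class below are illustrative choices, not the paper's setup.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
d, n, reps = 10, 200, 2000  # illustrative sizes, not from the paper

# i.i.d. log-concave coordinates: centered Exp(1), so mean 0 and Sigma = I_d.
X = rng.exponential(size=(reps, n, d)) - 1.0
W = X.sum(axis=1) / np.sqrt(n)  # W = n^{-1/2} * sum_i X_i, shape (reps, d)

# One-sided rectangles {w : max_j w_j <= t}: compare the empirical
# probability under W with the limiting Gaussian probability Phi(t)^d.
ts = np.linspace(-1.0, 3.0, 41)
emp = np.array([(W.max(axis=1) <= t).mean() for t in ts])
phi = np.array([0.5 * (1.0 + erf(t / sqrt(2.0))) for t in ts])  # N(0,1) CDF
err = float(np.abs(emp - phi**d).max())
print(f"max error over these rectangles ~ {err:.3f}")
```

The theorem's $C(\log^{13}(dn)/n)^{1/2}$ bound (or $C(\log^{3}(dn)/n)^{1/2}$ under KLS) controls the supremum over *all* rectangles; this sketch only probes a one-parameter family of them, so it gives a lower estimate of that supremum.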