High-dimensional CLT: Improvements, Non-uniform Extensions and Large Deviations (1806.06153v3)
Abstract: Central limit theorems (CLTs) for high-dimensional random vectors, with dimension possibly growing with the sample size, have received a lot of attention in recent years. Chernozhukov et al. (2017) proved a Berry--Esseen type result for high-dimensional averages over the class of hyperrectangles, showing that the rate of convergence can be upper bounded by $n^{-1/6}$ up to a polynomial factor of $\log p$ (where $n$ denotes the sample size and $p$ the dimension). Convergence of their bound to zero requires $\log^7 p = o(n)$. We improve upon their result; our bound requires only $\log^4 p = o(n)$ (in the best case). This improvement is made possible by a sharper dimension-free anti-concentration inequality for Gaussian processes on a compact metric space. In addition, we prove two non-uniform variants of the high-dimensional CLT based on the large deviation and non-uniform CLT results for random variables in a Banach space by Bentkus, Račkauskas, and Paulauskas. We apply our results in the context of post-selection inference in linear regression and of empirical processes.
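For context, a schematic version of the Chernozhukov et al. (2017) Berry--Esseen bound over hyperrectangles is sketched below (constants and exact moment conditions are suppressed; the precise exponent on the logarithmic factor is as stated in their paper, and the display is only meant to show why $\log^7 p = o(n)$ is needed for the bound to vanish):
$$\sup_{A \in \mathcal{A}^{\mathrm{re}}} \Bigl| \mathbb{P}\Bigl( \tfrac{1}{\sqrt{n}} \sum_{i=1}^n X_i \in A \Bigr) - \mathbb{P}(Z \in A) \Bigr| \;\le\; C \left( \frac{\log^7 (pn)}{n} \right)^{1/6},$$
where $\mathcal{A}^{\mathrm{re}}$ denotes the class of hyperrectangles in $\mathbb{R}^p$, $Z$ is a centered Gaussian vector with the same covariance as $n^{-1/2}\sum_{i=1}^n X_i$, and $C$ depends on the assumed moment bounds. The improvement described in the abstract reduces the logarithmic exponent, so that $\log^4 p = o(n)$ suffices for the bound to converge to zero in the best case.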