A distance theorem for inhomogeneous random rectangular matrices (2408.06309v1)
Abstract: Let $A \in \mathbb{R}^{n \times (n - d)}$ be a random matrix with independent uniformly anti-concentrated entries satisfying $\mathbb{E}\lVert A\rVert_{HS}^2 \leq Kn(n-d)$, and let $H$ be the subspace spanned by the columns of $A$. Let $X \in \mathbb{R}^n$ be a random vector with uniformly anti-concentrated entries. We show that when $1 \leq d \leq \lambda n/\log n$, the distance between $X$ and $H$ satisfies the following small ball probability estimate: \[ \Pr\left( \mathrm{dist}(X,H) \leq t\sqrt{d} \right) \leq (Ct)^{d} + e^{-cn}, \] for some constants $\lambda, c, C > 0$. This extends the distance theorems of Rudelson and Vershynin, of Livshyts, and of Livshyts, Tikhomirov, and Vershynin by dropping any identical-distribution assumptions on the entries of $X$ and $A$. Furthermore, it can be applied to prove numerous results about random matrices in the inhomogeneous setting, including lower tail estimates on the smallest singular value of rectangular matrices and upper tail estimates on the smallest singular value of square matrices. To obtain a distance theorem for inhomogeneous rectangular matrices we introduce a new tool for this general ensemble of random matrices, the Randomized Logarithmic LCD: a natural combination of the Randomized LCD, used in the study of the smallest singular values of inhomogeneous square matrices, and of the Logarithmic LCD, used in the study of no-gaps delocalization of eigenvectors and the smallest singular values of Hermitian random matrices.
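The quantity $\mathrm{dist}(X, H)$ in the theorem can be checked numerically. The sketch below (an illustration, not code from the paper) samples $A$ and $X$ with Rademacher ($\pm 1$) entries, one simple uniformly anti-concentrated choice, computes the distance from $X$ to the column span $H$ of $A$ by least squares, and estimates how often the distance falls below $t\sqrt{d}$ for a small $t$; the matrix size and trial count are arbitrary choices for the demonstration.

```python
import numpy as np

# Monte Carlo sketch of dist(X, H): H is the column span of a random
# n x (n - d) matrix A, and X is a random vector in R^n.
# Rademacher entries are one simple uniformly anti-concentrated choice.
rng = np.random.default_rng(0)
n, d, trials = 200, 10, 500

dists = []
for _ in range(trials):
    A = rng.choice([-1.0, 1.0], size=(n, n - d))
    X = rng.choice([-1.0, 1.0], size=n)
    # Project X onto H via least squares: c = argmin ||X - A c||,
    # so dist(X, H) = ||X - A c||.
    c, *_ = np.linalg.lstsq(A, X, rcond=None)
    dists.append(np.linalg.norm(X - A @ c))

dists = np.array(dists)
# The theorem says P(dist(X, H) <= t * sqrt(d)) <= (Ct)^d + e^{-cn},
# so the distance should rarely fall far below sqrt(d).
print(f"median dist / sqrt(d) = {np.median(dists) / np.sqrt(d):.2f}")
print(f"fraction below 0.1*sqrt(d) = {np.mean(dists <= 0.1 * np.sqrt(d)):.3f}")
```

In this experiment the typical distance is on the order of $\sqrt{d}$, and the empirical small-ball probability at $t = 0.1$ is essentially zero, consistent with the $(Ct)^d$ decay in the bound.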