Random sections of ellipsoids and the power of random information (1901.06639v2)
Abstract: We study the circumradius of the intersection of an $m$-dimensional ellipsoid $\mathcal E$ with semi-axes $\sigma_1\geq\dots\geq \sigma_m$ with random subspaces of codimension $n$. We find that, under certain assumptions on $\sigma$, this random radius $\mathcal{R}_n=\mathcal{R}_n(\sigma)$ is of the same order as the minimal such radius $\sigma_{n+1}$ with high probability. In other situations $\mathcal{R}_n$ is close to the maximum $\sigma_1$. The random variable $\mathcal{R}_n$ naturally corresponds to the worst-case error of the best algorithm based on random information for $L_2$-approximation of functions from a compactly embedded Hilbert space $H$ with unit ball $\mathcal E$. In particular, $\sigma_k$ is the $k$th largest singular value of the embedding $H\hookrightarrow L_2$. In this formulation, one can also consider the case $m=\infty$, and we prove that random information behaves very differently depending on whether $\sigma \in \ell_2$ or not. For $\sigma \notin \ell_2$, random information is completely useless, i.e., $\mathbb E[\mathcal{R}_n] = \sigma_1$. For $\sigma \in \ell_2$, the expected radius of random information tends to zero at rate $o(1/\sqrt{n})$ as $n\to\infty$. In the important case $\sigma_k \asymp k^{-\alpha} \ln^{-\beta}(k+1)$, where $\alpha > 0$ and $\beta\in\mathbb R$, we obtain that $$ \mathbb E [\mathcal{R}_n(\sigma)] \asymp \begin{cases} \sigma_1 & : \alpha<1/2 \,\text{ or }\, \beta\leq\alpha=1/2, \\ \sigma_n \, \sqrt{\ln(n+1)} & : \beta>\alpha=1/2, \\ \sigma_{n+1} & : \alpha>1/2. \end{cases} $$ In the proofs we use a comparison result for Gaussian processes à la Gordon, exponential estimates for sums of chi-squared random variables, and estimates for the extreme singular values of (structured) Gaussian random matrices. The upper bound is constructive; it is proven for the worst-case error of a least squares estimator.
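The random radius can be computed exactly in finite dimensions, which makes the dichotomy in the abstract easy to observe numerically. For a codimension-$n$ subspace given as the kernel of an $n\times m$ Gaussian matrix $G$, write $x=By$ with $B$ an orthonormal basis of $\ker G$; then the circumradius of $\mathcal E\cap\ker G$ equals $\lambda_{\min}(B^\top D^{-2}B)^{-1/2}$, where $D=\mathrm{diag}(\sigma)$. The following sketch (not from the paper; function names and the parameter choices $m=200$, $n=20$ are illustrative assumptions) estimates $\mathbb E[\mathcal R_n]$ for $\sigma_k=k^{-\alpha}$ by Monte Carlo:

```python
import numpy as np

def random_section_radius(sigma, n, rng):
    """Circumradius of the ellipsoid with semi-axes `sigma` (in R^m)
    intersected with the kernel of an n x m Gaussian matrix.
    Illustrative helper, not code from the paper."""
    m = len(sigma)
    G = rng.standard_normal((n, m))
    # Orthonormal basis of ker(G): the last m-n right singular vectors.
    _, _, Vt = np.linalg.svd(G)
    B = Vt[n:].T                                   # shape (m, m-n)
    # Maximize ||x|| over x = B y subject to sum (x_i/sigma_i)^2 <= 1;
    # the maximum is 1/sqrt(lambda_min(B^T diag(sigma^-2) B)).
    M = B.T @ (B / np.asarray(sigma)[:, None] ** 2)
    lam_min = np.linalg.eigvalsh(M)[0]             # eigenvalues ascending
    return 1.0 / np.sqrt(lam_min)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n = 200, 20                                 # assumed toy sizes
    for alpha in (0.25, 1.0):
        sigma = np.arange(1, m + 1, dtype=float) ** -alpha
        r = np.mean([random_section_radius(sigma, n, rng)
                     for _ in range(20)])
        print(f"alpha={alpha}: E[R_n] ~ {r:.3f}, "
              f"sigma_1={sigma[0]:.3f}, sigma_{n+1}={sigma[n]:.3f}")
```

For $\alpha>1/2$ the estimate hugs $\sigma_{n+1}$, while for small $\alpha$ it stays near $\sigma_1$, consistent with the phase transition stated above. Note that $\sigma_{n+1}\le\mathcal R_n\le\sigma_1$ always holds, since $\sigma_{n+1}$ is the minimal radius over all codimension-$n$ subspaces.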