On the minimax rate of the Gaussian sequence model under bounded convex constraints (2201.07329v5)
Abstract: We determine the exact minimax rate of a Gaussian sequence model under bounded convex constraints, purely in terms of the local geometry of the given constraint set $K$. Our main result shows that the minimax risk (up to constant factors) under the squared $\ell_2$ loss is given by $\epsilon^{*2} \wedge \operatorname{diam}(K)^2$ with \begin{align*} \epsilon^* = \sup \bigg\{\epsilon : \frac{\epsilon^2}{\sigma^2} \leq \log M^{\operatorname{loc}}(\epsilon)\bigg\}, \end{align*} where $\log M^{\operatorname{loc}}(\epsilon)$ denotes the local entropy of the set $K$, and $\sigma^2$ is the variance of the noise. We utilize our abstract result to re-derive known minimax rates for some special sets $K$ such as hyperrectangles, ellipses, and more generally quadratically convex orthosymmetric sets. Finally, we extend our results to the unbounded case with known $\sigma^2$ to show that the minimax rate in that case is $\epsilon^{*2}$.
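The critical radius $\epsilon^*$ is defined as the largest $\epsilon$ satisfying $\epsilon^2/\sigma^2 \leq \log M^{\operatorname{loc}}(\epsilon)$. Since the local entropy is typically non-increasing in $\epsilon$, the set of feasible $\epsilon$ is an interval and the supremum can be located by bisection. The sketch below illustrates this fixed-point computation; the local entropy function passed in is a hypothetical stand-in (a constant $d$, roughly the behavior for a large $d$-dimensional Euclidean ball), not a quantity derived in the paper, and the solver assumes the monotonicity just described.

```python
import math

def critical_radius(log_M_loc, sigma, lo=1e-9, hi=1e9, iters=200):
    """Bisection for eps* = sup{eps : eps^2 / sigma^2 <= log_M_loc(eps)}.

    Assumes eps -> eps^2/sigma^2 - log_M_loc(eps) crosses zero once
    (true when the local entropy is non-increasing in eps), so the
    feasible set is an interval [0, eps*] and bisection applies.
    """
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mid ** 2 / sigma ** 2 <= log_M_loc(mid):
            lo = mid  # mid is feasible: eps* lies at or above mid
        else:
            hi = mid  # mid infeasible: eps* lies below mid
    return lo

# Hypothetical local entropy: log M_loc(eps) = d, constant in eps.
# The fixed point is then eps*^2 = sigma^2 * d, the classical rate
# sigma * sqrt(d) for a (large) d-dimensional Euclidean ball.
d, sigma = 100, 0.5
eps_star = critical_radius(lambda eps: d, sigma)
print(eps_star)  # close to sigma * sqrt(d) = 5.0
```

In this toy case the minimax rate $\epsilon^{*2} \wedge \operatorname{diam}(K)^2$ reduces to $\sigma^2 d$ whenever the diameter of $K$ is large relative to $\sigma\sqrt{d}$.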