On Minimax Optimality of Sparse Bayes Predictive Density Estimates (1707.04380v2)
Abstract: We study predictive density estimation under Kullback-Leibler loss in $\ell_0$-sparse Gaussian sequence models. We propose proper Bayes predictive density estimates and establish asymptotic minimaxity in sparse models. A surprise is the existence of a phase transition in the future-to-past variance ratio $r$. For $r < r_0 = (\sqrt{5} - 1)/4$, the natural discrete prior ceases to be asymptotically optimal. Instead, for subcritical $r$, a 'bi-grid' prior with a central region of reduced grid spacing recovers asymptotic minimaxity. This phenomenon seems to have no analog in the otherwise parallel theory of point estimation of a multivariate normal mean under quadratic loss. For spike-and-slab priors to have any prospect of minimaxity, we show that the sparse parameter space must also be magnitude-constrained. Within a substantial range of magnitudes, spike-and-slab priors can attain asymptotic minimaxity.
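For orientation, here is a minimal sketch of the standard sparse Gaussian sequence predictive-density setup the abstract refers to; the notation ($v_x$, $v_y$, $\Theta_n[s]$) is illustrative and may differ from the paper's own. One observes past data $X \mid \theta \sim N_n(\theta, v_x I_n)$ and seeks to estimate the density of future data $Y \mid \theta \sim N_n(\theta, v_y I_n)$, with future-to-past variance ratio $r = v_y / v_x$ and sparse parameter space $\Theta_n[s] = \{\theta \in \mathbb{R}^n : \#\{i : \theta_i \neq 0\} \le s\}$. A predictive density estimate $\hat p(\cdot \mid X)$ is evaluated by Kullback-Leibler loss
$$ L(\theta, \hat p) = \int p(y \mid \theta) \, \log \frac{p(y \mid \theta)}{\hat p(y \mid X)} \, dy, $$
and a prior $\pi$ induces the proper Bayes predictive density
$$ \hat p_\pi(y \mid x) = \int p(y \mid \theta)\, \pi(\theta \mid x)\, d\theta. $$
Asymptotic minimaxity then means that the worst-case risk of $\hat p_\pi$ over $\Theta_n[s]$ matches the minimax risk to first order as $n \to \infty$.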