On the within-family Kullback-Leibler risk in Gaussian Predictive models

Published 3 Dec 2012 in math.ST and stat.TH | arXiv:1212.0325v1

Abstract: We consider estimating the predictive density under Kullback-Leibler loss in a high-dimensional Gaussian model. We discuss decision theoretic properties of the within-family prediction error -- the minimal risk among estimates in the class $\mathcal{G}$ of all Gaussian densities. We show that in sparse models, the class $\mathcal{G}$ is minimax sub-optimal. We produce asymptotically sharp upper and lower bounds on the within-family prediction errors for various subfamilies of $\mathcal{G}$. Under mild regularity conditions, in the sub-family where the covariance structure is represented by a single data-dependent parameter $\hat{\Sigma}=\hat{d} \cdot I$, the Kullback-Leibler risk has a tractable decomposition which can subsequently be minimized to yield optimally flattened predictive density estimates. The optimal predictive risk can be expressed explicitly in terms of the mean squared error of the corresponding location estimate, so the role of shrinkage in the predictive regime can be determined from point estimation theory. Our results demonstrate that some of the decision theoretic parallels between the predictive density estimation and point estimation regimes can be explained by second-moment-based concentration properties of the quadratic loss.
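To illustrate the "flattening" idea from the abstract, the sketch below (not the paper's construction; all quantities are illustrative choices) Monte Carlo estimates the Kullback-Leibler risk of an isotropic Gaussian predictive density $N(\hat{\mu}, c\,\sigma^2 I)$ whose location estimate $\hat{\mu}$ carries sampling error. Inflating the predictive variance by a factor $c > 1$, tied to the mean squared error of $\hat{\mu}$, lowers the KL risk relative to the plug-in density with $c = 1$:

```python
import numpy as np

def kl_gaussian_iso(mu_true, s2_true, mu_hat, s2_hat):
    """KL( N(mu_true, s2_true*I) || N(mu_hat, s2_hat*I) ), closed form
    for isotropic d-dimensional Gaussians."""
    d = len(mu_true)
    return 0.5 * (d * s2_true / s2_hat
                  + np.sum((mu_hat - mu_true) ** 2) / s2_hat
                  - d + d * np.log(s2_hat / s2_true))

rng = np.random.default_rng(0)
d, s2 = 50, 1.0          # illustrative dimension and noise level
mu = np.zeros(d)          # true location
n_rep = 200

# Average KL risk for several variance-inflation ("flattening") factors c.
# Here mu_hat is a noisy location estimate with mean squared error d*s2,
# mimicking an unbiased estimate from a single observation.
risks = {}
for c in [1.0, 1.5, 2.0, 3.0]:
    total = 0.0
    for _ in range(n_rep):
        mu_hat = mu + rng.normal(scale=np.sqrt(s2), size=d)
        total += kl_gaussian_iso(mu, s2, mu_hat, c * s2)
    risks[c] = total / n_rep
```

In this toy setup the expected risk is $\tfrac{d}{2}\,(2/c - 1 + \log c)$, minimized at $c = 2$, so the plug-in choice $c = 1$ is dominated by a flattened predictive density. This is only meant to convey why the optimal flattening level in the sub-family $\hat{\Sigma} = \hat{d}\cdot I$ can be read off from the mean squared error of the location estimate.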
