On Predictive Density Estimation under $\alpha$-divergence Loss
Abstract: Based on $X \sim N_d(\theta, \sigma^2_X I_d)$, we study the efficiency of predictive densities under $\alpha$-divergence loss $L_{\alpha}$ for estimating the density of $Y \sim N_d(\theta, \sigma^2_Y I_d)$. We identify a large number of cases where improvements on a plug-in density are obtainable by expanding the variance, thus extending earlier findings applicable to Kullback-Leibler loss. The results and proofs are unified with respect to the dimension $d$, the variances $\sigma^2_X$ and $\sigma^2_Y$, and the choice of loss $L_{\alpha}$ with $\alpha \in (-1,1)$. The findings also apply to a large number of plug-in densities, as well as to restricted parameter spaces with $\theta \in \Theta \subset \mathbb{R}^d$. The theoretical findings are accompanied by various observations, illustrations, and implications concerning, for instance, robustness with respect to the model variances and simultaneous dominance with respect to the loss.
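For orientation, a standard parametrization of the $\alpha$-divergence loss for $\alpha \in (-1,1)$, between the true density $p_{\theta}$ of $Y$ and a predictive density $\hat{q}$, is the following; the abstract does not display the loss, so this normalization is an assumption about the convention in use:
$$
L_{\alpha}(\theta, \hat{q}) \;=\; \frac{4}{1-\alpha^{2}} \left( 1 - \int_{\mathbb{R}^d} \hat{q}(y)^{\frac{1+\alpha}{2}} \, p_{\theta}(y)^{\frac{1-\alpha}{2}} \, dy \right).
$$
Under this parametrization, Kullback-Leibler loss arises as the limit $\alpha \to -1$, and the variance-expanded alternatives to a plug-in density $N_d(\hat{\theta}(X), \sigma^2_Y I_d)$ take the form $N_d(\hat{\theta}(X), c\,\sigma^2_Y I_d)$ with expansion constant $c > 1$ (notation ours).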