About the non-asymptotic behaviour of Bayes estimators (1402.3695v2)
Abstract: This paper investigates the {\em nonasymptotic} properties of Bayes procedures for estimating an unknown distribution from $n$ i.i.d.\ observations. We assume that the prior is supported by a model $(\scr{S},h)$ (where $h$ denotes the Hellinger distance) with suitable metric properties involving the number of small balls that are needed to cover larger ones. We also require that the prior put enough probability on small balls. We consider two different situations. The simplest case is that of a parametric model containing the target density, for which we show that the posterior concentrates around the true distribution at rate $1/\sqrt{n}$. In the general situation, we relax the parametric assumption and take into account a possible misspecification of the model. Provided that the Kullback-Leibler information between the true distribution and $\scr{S}$ is finite, we establish risk bounds for the Bayes estimators.
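For reference, the following recalls the standard definitions behind the notation $h$ and the Bayes posterior used above; the normalization of the Hellinger distance is one common convention, and the display is illustrative rather than quoted from the paper. For densities $p$ and $q$ with respect to a dominating measure $\mu$,
\[
h^{2}(p,q) \;=\; \frac{1}{2}\int \bigl(\sqrt{p}-\sqrt{q}\,\bigr)^{2}\,d\mu ,
\]
and, given a prior $\pi$ on the model $\scr{S}$ and i.i.d.\ observations $X_1,\dots,X_n$, the posterior assigns to a measurable set $A\subset\scr{S}$ the probability
\[
\pi\bigl(A \mid X_1,\dots,X_n\bigr)
\;=\;
\frac{\int_{A}\prod_{i=1}^{n} s(X_i)\,\pi(ds)}
     {\int_{\scr{S}}\prod_{i=1}^{n} s(X_i)\,\pi(ds)} .
\]
The concentration result for the parametric case says that, under the stated metric and prior-mass conditions, this posterior asymptotically puts most of its mass on Hellinger balls of radius of order $1/\sqrt{n}$ around the true density.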