
Ensemble Estimation of Information Divergence (1601.06884v3)

Published 26 Jan 2016 in cs.IT and math.IT

Abstract: Recent work has focused on the problem of nonparametric estimation of information divergence functionals. Many existing approaches are restrictive in their assumptions on the density support set or require difficult calculations at the support boundary which must be known a priori. The MSE convergence rate of a leave-one-out kernel density plug-in divergence functional estimator for general bounded density support sets is derived where knowledge of the support boundary is not required. The theory of optimally weighted ensemble estimation is generalized to derive a divergence estimator that achieves the parametric rate when the densities are sufficiently smooth. The asymptotic distribution of this estimator and some guidelines for tuning parameter selection are provided. Based on the theory, an empirical estimator of Rényi-α divergence is proposed that outperforms the standard kernel density plug-in estimator, especially in high dimension. The estimator is shown to be robust to the choice of tuning parameters. As an illustration, we use the estimator to estimate bounds on the Bayes error rate of a classification problem.
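
To make the abstract's setup concrete, the sketch below shows a leave-one-out Gaussian KDE plug-in estimator of the Rényi-α divergence and an ensemble that averages plug-in estimates over several bandwidths. This is a minimal illustration, not the paper's estimator: the function names (`renyi_divergence_plugin`, `renyi_divergence_ensemble`) are hypothetical, and uniform ensemble weights are used for simplicity, whereas the paper derives optimally chosen weights that cancel lower-order bias terms to reach the parametric rate.

```python
import numpy as np

def _kde(points, queries, h, exclude_self=False):
    """Gaussian KDE with bandwidth h evaluated at the query points.
    If exclude_self is True, queries are assumed to equal points and a
    leave-one-out estimate is returned (each query drops its own kernel)."""
    n, d = points.shape
    d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    k = np.exp(-d2 / (2.0 * h ** 2)) / ((2.0 * np.pi) ** (d / 2) * h ** d)
    if exclude_self:
        np.fill_diagonal(k, 0.0)
        return k.sum(axis=1) / (n - 1)
    return k.sum(axis=1) / n

def renyi_divergence_plugin(x, y, alpha, h):
    """Plug-in estimate of D_alpha(f || g) = log(E_f[(f/g)^(alpha-1)]) / (alpha-1),
    with x ~ f, y ~ g, and a leave-one-out KDE for f at its own samples."""
    f_hat = _kde(x, x, h, exclude_self=True)   # density of f at x (leave-one-out)
    g_hat = _kde(y, x, h)                      # density of g at x
    ratio = np.clip(f_hat / g_hat, 1e-12, None)
    return float(np.log(np.mean(ratio ** (alpha - 1.0))) / (alpha - 1.0))

def renyi_divergence_ensemble(x, y, alpha, bandwidths, weights=None):
    """Weighted ensemble of plug-in estimates over a set of bandwidths.
    Uniform weights are used here for illustration; the paper's optimally
    weighted ensemble solves for weights that cancel bias terms."""
    ests = np.array([renyi_divergence_plugin(x, y, alpha, h) for h in bandwidths])
    if weights is None:
        weights = np.full(len(bandwidths), 1.0 / len(bandwidths))
    return float(np.dot(weights, ests))

# Example: divergence between two Gaussians in d = 3 dimensions.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 3))
y = rng.normal(0.5, 1.0, size=(500, 3))
print(renyi_divergence_ensemble(x, y, alpha=0.8, bandwidths=[0.3, 0.5, 0.8]))
```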

Citations (22)
