Minimax Optimal Estimation of KL Divergence for Continuous Distributions (2002.11599v1)
Published 26 Feb 2020 in cs.IT, math.IT, and stat.ML
Abstract: Estimating the Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a minimax lower bound on the mean squared error and show that the kNN method is asymptotically rate optimal.
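The kNN-based estimator mentioned in the abstract is, in its classical form (Wang, Kulkarni, and Verdú), computable from nearest-neighbor distances alone: with n samples from p and m samples from q in d dimensions, it takes the form D-hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)), where rho_k(i) is the distance from x_i to its k-th nearest neighbor among the other x samples and nu_k(i) is its k-th nearest-neighbor distance among the y samples. Below is a minimal Python sketch of that classical construction; the function name `knn_kl_divergence` and the use of scipy's `cKDTree` are illustrative choices, and the exact variant analyzed in this paper may differ in details such as boundary or tail handling.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Estimate KL(p || q) from samples x ~ p of shape (n, d) and y ~ q of shape (m, d).

    Classical k-NN estimator:
        D-hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1))
    """
    n, d = x.shape
    m = y.shape[0]
    # rho: query k+1 neighbors within x so each point's zero distance to
    # itself is skipped; column k is the k-th genuine neighbor distance.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # nu: k-th nearest-neighbor distance of each x_i among the y samples.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]  # query returns shape (n, k) when k > 1
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Sanity check on Gaussians, where KL(N(0,1) || N(1,1)) = 0.5 exactly.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))  # samples from p = N(0, 1)
y = rng.normal(1.0, 1.0, size=(5000, 1))  # samples from q = N(1, 1)
print(knn_kl_divergence(x, y, k=5))       # should be close to 0.5
```

Larger k reduces the variance of the estimate at the cost of additional bias; the paper's analysis quantifies exactly this bias-variance trade-off and shows the resulting rate matches the minimax lower bound.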