Statistical estimation of the Kullback-Leibler divergence (1907.00196v1)

Published 29 Jun 2019 in math.ST and stat.TH

Abstract: Wide conditions are provided to guarantee asymptotic unbiasedness and L2-consistency of the introduced estimates of the Kullback-Leibler divergence for probability measures in Rd having densities w.r.t. the Lebesgue measure. These estimates are constructed by means of two independent collections of i.i.d. observations and involve the specified k-nearest neighbor statistics. In particular, the established results are valid for estimates of the Kullback-Leibler divergence between any two Gaussian measures in Rd with nondegenerate covariance matrices. As a byproduct we obtain new statements concerning the Kozachenko-Leonenko estimators of the Shannon differential entropy.
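The abstract concerns divergence estimates built from k-nearest-neighbor statistics of two independent i.i.d. samples. As a rough illustration only, the sketch below implements the classical k-NN estimator of D(P || Q) (in the style of Wang-Kulkarni-Verdú), which may differ from the exact construction analyzed in the paper; the function name knn_kl_divergence and the use of scipy.spatial.cKDTree are illustrative choices, not taken from the source.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Classical k-NN estimate of D(P || Q) from samples x ~ P and y ~ Q.

    x: (n, d) array of samples from P; y: (m, d) array of samples from Q.
    Assumes all sample points are distinct so nearest-neighbor distances
    are strictly positive. This is a sketch of the standard estimator,
    not necessarily the estimator studied in the paper.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # k-th nearest-neighbor distance of each x_i within the x-sample
    # (query k+1 neighbors and drop the zero self-distance).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # k-th nearest-neighbor distance of each x_i within the y-sample.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1.0))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two univariate Gaussians: N(0, 1) vs N(1, 2^2), echoing the Gaussian
    # case highlighted in the abstract.
    x = rng.normal(0.0, 1.0, size=(5000, 1))
    y = rng.normal(1.0, 2.0, size=(5000, 1))
    est = knn_kl_divergence(x, y, k=5)
    # Closed-form KL divergence between the two univariate Gaussians.
    exact = np.log(2.0 / 1.0) + (1.0 + (0.0 - 1.0) ** 2) / (2 * 2.0 ** 2) - 0.5
    print(f"kNN estimate: {est:.3f}, exact: {exact:.3f}")
```

For Gaussian measures, as in the demo above, the estimate can be checked against the closed-form KL divergence, which is one setting the abstract singles out for the established results.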
