Concentration Bounds for Discrete Distribution Estimation in KL Divergence (2302.06869v2)
Published 14 Feb 2023 in stat.ML, cs.DM, cs.IT, cs.LG, math.IT, and math.PR
Abstract: We study the problem of discrete distribution estimation in KL divergence and provide concentration bounds for the Laplace estimator. We show that the deviation from its mean scales as $\sqrt{k}/n$ when $n \ge k$, improving upon the best prior result of $k/n$. We also establish a matching lower bound showing that our bounds are tight up to polylogarithmic factors.
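To make the setting concrete, here is a minimal sketch (not from the paper) of the Laplace (add-one) estimator $\hat{p}_i = (N_i + 1)/(n + k)$ and the loss $\mathrm{KL}(p \,\|\, \hat{p})$ it is evaluated under. All specific choices below (the alphabet size $k$, sample size $n$, the Dirichlet-drawn source distribution, and the number of trials) are illustrative assumptions, not values from the paper.

```python
# A minimal sketch (assumptions: k, n, p, and trial count are illustrative).
import numpy as np

rng = np.random.default_rng(0)

k = 100    # alphabet size (assumed)
n = 1000   # sample size, with n >= k as in the regime discussed
p = rng.dirichlet(np.ones(k))  # an arbitrary true distribution on [k]

# Draw n i.i.d. samples and form the empirical counts N_i.
counts = rng.multinomial(n, p)

# Laplace (add-one) estimator: hat{p}_i = (N_i + 1) / (n + k).
p_hat = (counts + 1) / (n + k)

# Loss KL(p || hat{p}); hat{p}_i > 0 always, so the divergence is finite.
kl = np.sum(p * np.log(p / p_hat))
print(f"KL(p || p_hat) = {kl:.5f}")

# Repeat to illustrate concentration of the loss around its mean; the paper's
# bound says the deviation scales as sqrt(k)/n for n >= k, up to polylogs.
trials = 2000
losses = np.empty(trials)
for t in range(trials):
    counts = rng.multinomial(n, p)
    p_hat = (counts + 1) / (n + k)
    losses[t] = np.sum(p * np.log(p / p_hat))

print(f"mean loss       = {losses.mean():.5f}")
print(f"std dev of loss = {losses.std():.5f}")
print(f"sqrt(k)/n       = {np.sqrt(k) / n:.5f}")
```

The Monte Carlo loop only visualizes the phenomenon the concentration bound describes: the empirical spread of the KL loss around its mean can be compared against the $\sqrt{k}/n$ scale, though a single configuration of $(k, n, p)$ of course does not verify the bound.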