Minimax Rate-Optimal Estimation of Divergences between Discrete Distributions (1605.09124v5)

Published 30 May 2016 in cs.IT, math.IT, math.ST, and stat.TH

Abstract: We study the minimax estimation of $\alpha$-divergences between discrete distributions for integer $\alpha\ge 1$, which include the Kullback--Leibler divergence and the $\chi^2$-divergence as special examples. Dropping the usual theoretical tricks to acquire independence, we construct the first minimax rate-optimal estimator which does not require any Poissonization, sample splitting, or explicit construction of approximating polynomials. The estimator uses a hybrid approach which solves a problem-independent linear program based on moment matching in the non-smooth regime, and applies a problem-dependent bias-corrected plug-in estimator in the smooth regime, with a soft decision boundary between these regimes.
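To give a concrete sense of the "bias-corrected plug-in" ingredient mentioned in the abstract, below is a minimal illustrative sketch in Python of a naive plug-in estimator of the KL divergence with a simple Miller--Madow-style correction of the entropy term. This is an assumed baseline for intuition only, not the paper's hybrid estimator (which combines a moment-matching linear program in the non-smooth regime with a problem-dependent correction in the smooth regime); the function name and correction choice are hypothetical.

```python
import numpy as np

def plugin_kl_bias_corrected(x_counts, y_counts):
    """Illustrative plug-in estimate of KL(P || Q) from empirical counts,
    with a Miller--Madow-style correction of the entropy term only.
    This is a simple baseline sketch, not the estimator from the paper."""
    n = x_counts.sum()
    m = y_counts.sum()
    p_hat = x_counts / n
    q_hat = y_counts / m

    # Crudely restrict to symbols observed in both samples so logs stay finite;
    # the unseen-symbol problem is exactly what more careful estimators handle.
    mask = (p_hat > 0) & (q_hat > 0)
    kl_plugin = np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask]))

    # Plug-in entropy H(P_hat) underestimates H(P) by roughly (S - 1) / (2n),
    # where S is the number of observed symbols, so the -H(P_hat) term inside
    # the plug-in KL is biased upward; subtract that amount as a first-order fix.
    s_hat = np.count_nonzero(x_counts)
    return kl_plugin - (s_hat - 1) / (2 * n)

# Example usage on synthetic data over an alphabet of size 100.
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(100))
q = rng.dirichlet(np.ones(100))
x = rng.multinomial(5000, p)
y = rng.multinomial(5000, q)
print(plugin_kl_bias_corrected(x, y))
```

This sketch only corrects the entropy-term bias and ignores the (typically larger) bias from small probabilities under $Q$; handling that regime optimally is the role of the moment-matching linear program in the paper's non-smooth regime.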

Authors (3)
  1. Yanjun Han (71 papers)
  2. Jiantao Jiao (83 papers)
  3. Tsachy Weissman (106 papers)
Citations (42)
