
Entropy of Independent Experiments, Revisited (1704.09007v1)

Published 28 Apr 2017 in cs.IT and math.IT

Abstract: The weak law of large numbers implies that, under mild assumptions on the source, the Rényi entropy per produced symbol converges (in probability) towards the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not necessarily identically distributed) outputs, generalizing and improving the result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) We characterize the sources with \emph{slowest convergence} (for a given entropy): their outputs are mixtures of a uniform distribution and a unit mass. (b) Based on this characterization, we establish faster convergence in \emph{high-entropy} regimes. We discuss how these improved bounds can be used to better quantify the security of the outputs of random number generators. In turn, the characterization of the "worst" distributions can be used to derive sharp "extremal" inequalities between Rényi and Shannon entropy. The main technique is \emph{non-convex programming}, used to characterize distributions with possibly large exponential moments subject to an entropy constraint.
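The extremal sources in (a) have a simple single-symbol form: a distribution placing mass $\beta$ on one symbol and spreading the remaining $1-\beta$ uniformly over the alphabet. The following is a minimal numerical sketch, not code from the paper, using the standard definitions $H(p) = -\sum_i p_i \log_2 p_i$ (Shannon) and $H_\alpha(p) = \frac{1}{1-\alpha}\log_2 \sum_i p_i^\alpha$ (Rényi); the function names and the parameters `n` and `beta` are illustrative assumptions. It shows how far apart the two entropies can be on such a mixture, which is the gap driving the slow convergence:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits, skipping zero-probability symbols."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha != 1) in bits."""
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def spike_plus_uniform(n, beta):
    """Mixture of a unit mass on symbol 0 (weight beta) and a
    uniform distribution on n symbols (weight 1 - beta)."""
    p = np.full(n, (1.0 - beta) / n)
    p[0] += beta
    return p

if __name__ == "__main__":
    p = spike_plus_uniform(n=1024, beta=0.5)
    print("Shannon entropy:", shannon_entropy(p))        # ~6 bits
    print("Renyi entropy (alpha=2):", renyi_entropy(p, 2.0))  # ~2 bits
```

For this choice of `n` and `beta`, the Shannon entropy is roughly three times the Rényi entropy of order 2, illustrating why mixtures of a uniform distribution and a unit mass are the worst case for a given entropy level.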
