
Entropy of Independent Experiments, Revisited

Published 28 Apr 2017 in cs.IT and math.IT | (1704.09007v1)

Abstract: The weak law of large numbers implies that, under mild assumptions on the source, the Rényi entropy per produced symbol converges (in probability) to the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not necessarily identically distributed) outputs, generalizing and improving a result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) We characterize the sources with \emph{slowest convergence} (for a given entropy): their outputs are mixtures of a uniform distribution and a unit mass. (b) Based on this characterization, we establish faster convergence in \emph{high-entropy} regimes. We discuss how these improved bounds can be used to better quantify the security of the outputs of random number generators. In turn, the characterization of the "worst" distributions can be used to derive sharp "extremal" inequalities between Rényi and Shannon entropy. The main technique is \emph{non-convex programming}, used to characterize distributions with possibly large exponential moments under an entropy constraint.
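The extremal family described in (a) is easy to illustrate numerically. The sketch below (an illustration of the idea, not the paper's construction; the alphabet size `n` and mass `b` are arbitrary choices) builds a distribution that places probability `b` on one symbol and spreads the rest uniformly, then compares its Shannon entropy with its Rényi (collision) entropy of order 2 to show how large the gap can be:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits; alpha == 1 gives Shannon entropy."""
    if alpha == 1:
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

def uniform_plus_mass(n, b):
    """Mixture of a unit mass (weight b on one symbol) and a uniform
    distribution over the remaining n - 1 symbols."""
    return [b] + [(1 - b) / (n - 1)] * (n - 1)

# Illustrative parameters (not from the paper): 1024 symbols, half the mass
# concentrated on a single point.
p = uniform_plus_mass(1024, 0.5)
shannon = renyi_entropy(p, 1)   # roughly 6 bits
collision = renyi_entropy(p, 2) # roughly 2 bits
```

Even though the Shannon entropy is moderately large here, the collision entropy stays close to 2 bits, which is why such mixtures are the worst case for Rényi-to-Shannon convergence.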

