Informational Divergence and Entropy Rate on Rooted Trees with Probabilities
Published 10 Oct 2013 in cs.IT and math.IT | (1310.2882v1)
Abstract: Rooted trees with probabilities are used to analyze properties of a variable-length code. A bound is derived on the difference between the entropy rates of the code and a memoryless source. The bound is stated in terms of normalized informational divergence and is used to derive converses for exact random number generation, resolution coding, and distribution matching.
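The quantities in the abstract can be illustrated concretely. In a minimal sketch (the specific tree, leaf distribution, and fair-coin reference source below are illustrative assumptions, not taken from the paper), the leaves of a binary parsing tree carry a distribution P, a memoryless Bernoulli(1/2) source induces Q(leaf) = 2^(-length), and both the entropy rate and the informational divergence are normalized by the expected codeword length:

```python
import math

# Hypothetical example: leaves of a binary code tree as (codeword, P(leaf)).
# These values are illustrative, not from the paper.
leaves = [("0", 0.5), ("10", 0.3), ("11", 0.2)]

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

expected_len = sum(len(w) * p for w, p in leaves)

# Entropy rate of the code: leaf entropy normalized by expected length.
h_rate = entropy([p for _, p in leaves]) / expected_len

# Informational divergence D(P || Q), where Q(leaf) = 2^{-len(codeword)}
# is induced by a fair-coin memoryless source.
div = sum(p * math.log2(p / 2 ** -len(w)) for w, p in leaves)

# Normalized informational divergence, as used in the paper's bound.
norm_div = div / expected_len

print(f"entropy rate = {h_rate:.4f} bits/symbol")
print(f"normalized divergence = {norm_div:.4f} bits/symbol")
```

For a dyadic leaf distribution (P(leaf) = 2^(-length)) the divergence is zero and the entropy rate equals 1 bit per symbol; the paper's bound controls how far the entropy rate can deviate from the source's when the normalized divergence is small.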