
Rényi Divergence and Majorization

Published 25 Jan 2010 in cs.IT and math.IT (arXiv:1001.4448v3)

Abstract: Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including its relation to some other distances. We show how Rényi divergence appears when the theory of majorization is generalized from the finite to the continuous setting. Finally, Rényi divergence plays a role in analyzing the number of binary questions required to guess the values of a sequence of random variables.
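For reference, a minimal sketch of the standard definitions (the notation here is assumed, not quoted from the paper): for probability distributions $P = (p_1, \dots, p_n)$ and $Q = (q_1, \dots, q_n)$ and an order $\alpha \neq 1$,

\[
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_{i=1}^{n} p_i^{\alpha} q_i^{1-\alpha},
\qquad
H_\alpha(P) = \frac{1}{1 - \alpha} \log \sum_{i=1}^{n} p_i^{\alpha}.
\]

The analogy in the abstract can then be made concrete: against the uniform distribution $U_n$, one has $D_\alpha(P \,\|\, U_n) = \log n - H_\alpha(P)$, mirroring the identity $D(P \,\|\, U_n) = \log n - H(P)$ for Kullback-Leibler divergence and Shannon entropy; as $\alpha \to 1$, $D_\alpha$ recovers the Kullback-Leibler divergence.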

Citations (59)

