
On Exact and $\infty$-Rényi Common Informations

Published 30 Sep 2018 in cs.IT and math.IT (arXiv:1810.00295v5)

Abstract: Recently, two extensions of Wyner's common information (exact and Rényi common informations) were introduced respectively by Kumar, Li, and El Gamal (KLE), and the present authors. The class of common information problems involves determining the minimum rate of the common input to two independent processors needed to exactly or approximately generate a target joint distribution. For the exact common information problem, exact generation of the target distribution is required, while for Wyner's and $\alpha$-Rényi common informations, the relative entropy and Rényi divergence with order $\alpha$ were respectively used to quantify the discrepancy between the synthesized and target distributions. The exact common information is larger than or equal to Wyner's common information. However, it was hitherto unknown whether the former is strictly larger than the latter for some joint distributions. In this paper, we first establish the equivalence between the exact and $\infty$-Rényi common informations, and then provide single-letter upper and lower bounds for these two quantities. For doubly symmetric binary sources, we show that the upper and lower bounds coincide, which implies that for such sources, the exact and $\infty$-Rényi common informations are completely characterized. Interestingly, we observe that for such sources, these two common informations are strictly larger than Wyner's. This answers an open problem posed by KLE. Furthermore, we extend Wyner's, $\infty$-Rényi, and exact common informations to sources with countably infinite or continuous alphabets, including Gaussian sources.
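For orientation, the standard definitions behind the quantities named in the abstract can be sketched as follows; the notation (a target joint distribution $\pi_{XY}$, a common random variable $W$) is chosen here for illustration and is not taken from the paper. Wyner's common information is the minimum mutual information between the source pair and a common variable that renders the two sources conditionally independent:
$$C_{\mathrm{W}}(X;Y) \;=\; \min_{P_{W|XY}\,:\,X - W - Y} I(XY; W),$$
where the minimization is over all auxiliary variables $W$ such that $X$ and $Y$ are conditionally independent given $W$. The Rényi divergence of order $\alpha$ and its $\alpha \to \infty$ limit are
$$D_{\alpha}(P\|Q) \;=\; \frac{1}{\alpha-1}\log \sum_{x} P(x)^{\alpha} Q(x)^{1-\alpha}, \qquad D_{\infty}(P\|Q) \;=\; \log \sup_{x} \frac{P(x)}{Q(x)}.$$
In the $\alpha$-Rényi formulation, $D_{\alpha}$ replaces relative entropy as the measure of discrepancy between the synthesized and target distributions; the paper's first result is that the $\alpha=\infty$ case coincides with the exact common information.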
