
Generalized Common Informations: Measuring Commonness by the Conditional Maximal Correlation

Published 28 Oct 2016 in cs.IT, cs.CR, math.IT, math.PR, math.ST, and stat.TH | arXiv:1610.09289v3

Abstract: In the literature, different common informations were defined by Gács and Körner, by Wyner, and by Kumar, Li, and El Gamal, respectively. In this paper, we define two generalized versions of common information, named the approximate and exact information-correlation functions, by exploiting the conditional maximal correlation as a commonness or privacy measure. These two generalized common informations encompass the Gács-Körner, Wyner, and Kumar-Li-El Gamal common informations as special cases. Furthermore, to give operational characterizations of these two generalized common informations, we also study the problems of private source synthesis and common information extraction, and show that the information-correlation functions equal the minimum rates of commonness needed to satisfy certain conditional maximal correlation constraints in the centralized-setting versions of these problems. As a byproduct, we also study the conditional maximal correlation itself.
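The measure underlying both information-correlation functions is the Hirschfeld-Gebelein-Rényi maximal correlation. For a pair of discrete random variables, Witsenhausen's classical characterization makes it computable: it equals the second-largest singular value of the joint pmf matrix normalized by the square roots of the marginals. A minimal sketch of the unconditional case (the function name and the doubly symmetric binary source example below are illustrative, not taken from the paper):

```python
import numpy as np

def maximal_correlation(P):
    """HGR maximal correlation of a discrete joint pmf P (2-D array).

    By Witsenhausen's characterization, rho(X;Y) is the second-largest
    singular value of B[x, y] = P[x, y] / sqrt(p(x) * q(y)),
    where p and q are the marginals of X and Y.
    """
    P = np.asarray(P, dtype=float)
    p = P.sum(axis=1)  # marginal of X
    q = P.sum(axis=0)  # marginal of Y
    B = P / np.sqrt(np.outer(p, q))
    s = np.linalg.svd(B, compute_uv=False)  # singular values, descending
    return s[1]  # s[0] == 1 always; s[1] is the maximal correlation

# Doubly symmetric binary source with crossover probability 0.1:
P = np.array([[0.45, 0.05],
              [0.05, 0.45]])
rho = maximal_correlation(P)  # for a DSBS this equals 1 - 2*0.1 = 0.8
```

The conditional maximal correlation used in the paper extends this by conditioning on a third variable; the unconditional quantity above is the building block evaluated for each conditioning value.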

Citations (15)


Authors (3)
