Common information revisited (1104.3207v3)

Published 16 Apr 2011 in cs.IT, cs.DM, math.CO, and math.IT

Abstract: One of the main notions of information theory is the mutual information of two messages (two random variables in Shannon information theory, or two binary strings in algorithmic information theory). The mutual information of $x$ and $y$ measures how much the transmission of $x$ can be simplified if both the sender and the recipient know $y$ in advance. Gács and Körner gave an example where mutual information cannot be represented as common information (a third message easily extractable from both $x$ and $y$). This question was later studied in the framework of algorithmic information theory by An. Muchnik and A. Romashchenko, who found many other examples of this type. K. Makarychev and Yu. Makarychev found a new proof of the Gács–Körner result by means of conditionally independent random variables. The difference between mutual and common information can also be studied quantitatively: for given $x$ and $y$ we look for three messages $a$, $b$, $c$ such that $a$ and $c$ suffice to reconstruct $x$, while $b$ and $c$ suffice to reconstruct $y$. In this paper: we state and prove (using hypercontractivity of product spaces) a quantitative version of the Gács–Körner theorem; we study the tradeoff between $|a|$, $|b|$, $|c|$ for a random pair $(x, y)$ whose Hamming distance is $\varepsilon n$ (our bounds are almost tight); and we construct the "worst possible" distribution on $(x, y)$ in terms of the tradeoff between $|a|$, $|b|$, $|c|$.
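
To make the three-message setting concrete, here is a minimal Python sketch (illustrative, not from the paper) of one trivial point on the tradeoff curve for the Hamming-distance pair: take c = x itself, let b record the positions where x and y differ, and leave a empty, so that (a, c) recover x and (b, c) recover y with |a| = 0 and |c| = n. The helper names (make_pair, encode, decode_x, decode_y) are assumptions made for this sketch; the paper's results concern how much this naive split can be improved.

import random

def make_pair(n, eps, rng):
    # Sample x uniformly; flip each bit independently with probability eps,
    # so the Hamming distance between x and y is about eps * n.
    x = [rng.randint(0, 1) for _ in range(n)]
    y = [bit ^ (1 if rng.random() < eps else 0) for bit in x]
    return x, y

def encode(x, y):
    a = None                          # empty message: |a| = 0
    # b lists the differing positions; stored as indices this costs about
    # eps * n * log2(n) bits, and about H(eps) * n bits if compressed optimally.
    b = [i for i, (u, v) in enumerate(zip(x, y)) if u != v]
    c = list(x)                       # c is x itself: |c| = n
    return a, b, c

def decode_x(a, c):
    return list(c)                    # x = c, a is not needed here

def decode_y(b, c):
    y = list(c)
    for i in b:                       # flip the bits where x and y differ
        y[i] ^= 1
    return y

rng = random.Random(0)
x, y = make_pair(1000, 0.1, rng)
a, b, c = encode(x, y)
assert decode_x(a, c) == x and decode_y(b, c) == y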

Citations (3)
