
Wyner's Common Information: Generalizations and A New Lossy Source Coding Interpretation (1301.2237v1)

Published 10 Jan 2013 in cs.IT and math.IT

Abstract: Wyner's common information was originally defined for a pair of dependent discrete random variables. Its significance is largely reflected in, and hence also confined to, several existing interpretations in various source coding problems. This paper both generalizes its definition and expands its practical significance by providing a new operational interpretation. The generalization is twofold: the number of dependent variables can be arbitrary, and so can the alphabets of those random variables. New properties are established for the generalized Wyner's common information of N dependent variables. More importantly, a lossy source coding interpretation of Wyner's common information is developed using the Gray-Wyner network. In particular, it is established that the common information equals the smallest common message rate when the total rate is arbitrarily close to the rate-distortion function with joint decoding. A surprising observation is that this equality holds independently of the values of the distortion constraints, as long as the distortions lie within some distortion region. Examples of computing the common information are given, including that of a pair of dependent Gaussian random variables.
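For the bivariate Gaussian example mentioned in the abstract, the common information of a jointly Gaussian pair with correlation coefficient ρ admits the closed form C = (1/2) log((1 + |ρ|)/(1 − |ρ|)). A minimal sketch of evaluating this expression (the function name and the choice of bits as the unit are illustrative assumptions, not from the paper):

```python
import math

def gaussian_common_information(rho: float) -> float:
    """Wyner's common information, in bits, of a bivariate Gaussian pair
    with correlation coefficient rho (|rho| < 1), via the closed form
    C = (1/2) * log2((1 + |rho|) / (1 - |rho|))."""
    r = abs(rho)
    if r >= 1:
        raise ValueError("correlation magnitude must be strictly less than 1")
    return 0.5 * math.log2((1 + r) / (1 - r))

# Independent variables share no common information.
print(gaussian_common_information(0.0))  # → 0.0
# At rho = 0.6 the ratio is (1.6 / 0.4) = 4, so C = 1 bit.
print(gaussian_common_information(0.6))  # ≈ 1.0
```

Note that C grows without bound as |ρ| → 1, reflecting that nearly deterministic dependence requires an arbitrarily large common message rate.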

Citations (26)
