Communication Requirements for Generating Correlated Random Variables
Published 1 May 2008 in cs.IT, cs.GT, math.IT, and math.PR | (arXiv:0805.0065v1)
Abstract: Two familiar notions of correlation are rediscovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner's "common information" coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.
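As a concrete illustration of the two endpoints of the tradeoff (a numerical sketch, not part of the paper), consider a doubly symmetric binary source, i.e. a uniform input X passed through a binary symmetric channel with crossover probability eps. The minimum description rate with unlimited common randomness is Shannon's mutual information I(X;Y) = 1 - h(eps); with no common randomness it is Wyner's common information, which for this source has the closed form C(X;Y) = 1 + h(eps) - 2*h(a) with a = (1 - sqrt(1 - 2*eps))/2, from Wyner's 1975 paper. The variable names below are illustrative.

```python
import numpy as np

def entropy_bits(p):
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def mutual_information(p_xy):
    """I(X;Y) in bits for a joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal of Y (row vector)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])).sum())

# Doubly symmetric binary source: X ~ Bernoulli(1/2), Y = X xor Bernoulli(eps).
eps = 0.1
p_xy = np.array([[(1 - eps) / 2, eps / 2],
                 [eps / 2, (1 - eps) / 2]])

# Endpoint with unlimited common randomness: rate I(X;Y) = 1 - h(eps).
i_xy = mutual_information(p_xy)

# Endpoint with no common randomness: Wyner's common information,
# C(X;Y) = 1 + h(eps) - 2*h(a), a = (1 - sqrt(1 - 2*eps)) / 2  (Wyner, 1975).
a = (1 - np.sqrt(1 - 2 * eps)) / 2
c_xy = 1 + entropy_bits(eps) - 2 * entropy_bits(a)

print(f"I(X;Y) = {i_xy:.4f} bits, C(X;Y) = {c_xy:.4f} bits")
```

For eps = 0.1 this gives I(X;Y) ≈ 0.531 and C(X;Y) ≈ 0.873 bits, showing the gap between the two extreme operating points that the paper's rate-randomness tradeoff interpolates between.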