Common Randomness Generation from Sources with Countable Alphabet (2210.04556v2)

Published 10 Oct 2022 in cs.IT and math.IT

Abstract: We study a standard two-source model for common randomness (CR) generation in which Alice and Bob generate a common random variable with high probability of agreement by observing independent and identically distributed (i.i.d.) samples of correlated sources on countably infinite alphabets. The two parties are additionally allowed to communicate as little as possible over a noisy memoryless channel. In our work, we give a single-letter formula for the CR capacity for the proposed model and provide a rigorous proof of it. This is a challenging scenario because some properties of the entropy that hold for finite alphabets cannot be extended to the countably infinite case. Notably, it is known that the Shannon entropy is in fact discontinuous at all probability distributions with countably infinite support.
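The discontinuity mentioned at the end of the abstract can be made concrete numerically. The sketch below is an illustration (not taken from the paper): starting from a degenerate distribution P, we move a tiny mass eps onto N equally likely tail symbols, so the total-variation distance to P stays eps while the entropy grows roughly like eps * log2(N), without bound as N increases.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector p (zero terms ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# P = (1, 0, 0, ...) has H(P) = 0. Perturb it into
# Q = (1 - eps, eps/N, ..., eps/N) with N tail symbols.
# The total-variation distance between P and Q is eps, yet
# H(Q) = -(1 - eps) log2(1 - eps) - eps log2(eps / N) grows like eps * log2(N).
eps = 1e-3
for N in (10, 10**3, 10**6, 10**9):
    h = -(1 - eps) * np.log2(1 - eps) - eps * np.log2(eps / N)
    print(f"N = {N:>10}:  TV distance to P = {eps},  H(Q) = {h:.4f} bits")

# Small finite check against the closed form above (N kept small here).
print(shannon_entropy([1 - eps] + [eps / 10] * 10))
```

Because the entropy jump can be made arbitrarily large under an arbitrarily small perturbation, continuity-based arguments that work for finite alphabets break down, which is the technical obstacle the paper addresses.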

Citations (2)