
The Carbon Copy onto Dirty Paper Channel with Statistically Equivalent States (1602.02206v1)

Published 6 Feb 2016 in cs.IT and math.IT

Abstract: Costa's "writing on dirty paper" capacity result establishes that full state pre-cancellation can be attained in the Gelfand-Pinsker channel with additive state and additive Gaussian noise. The "carbon copy onto dirty paper" channel is the extension of Costa's model to the compound setting: M receivers each observe the sum of the channel input, Gaussian noise, and one of M Gaussian state sequences, and all attempt to decode the same common message. The state sequences are all non-causally known at the transmitter, which attempts to simultaneously pre-code its transmission against the channel state affecting each output. In this correspondence we derive the capacity, to within 2.25 bits per channel use, of the carbon copy onto dirty paper channel in which the state sequences are statistically equivalent, having the same variance and the same pairwise correlation. For this channel, capacity is approached by letting the channel input be the superposition of two codewords: a base codeword, simultaneously decoded at each user, and a top codeword, which is pre-coded against the state realization at each user for a fraction 1/M of the time. The outer bound relies on a recursive bounding argument in which incremental side information is provided to each receiver. This result represents a significant first step toward determining the capacity of the most general "carbon copy onto dirty paper" channel, in which the state sequences appearing in the different channel outputs have an arbitrary jointly Gaussian distribution.
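
For reference, the channel model described in the abstract can be sketched as follows; the notation below is illustrative and not necessarily the paper's. Each of the M receivers observes the channel input corrupted by its own Gaussian state and Gaussian noise, and Costa's single-state result shows that the known state causes no capacity loss when M = 1.

% Illustrative notation (assumed, not taken from the paper): input X^n with power
% constraint P, state S_m^n non-causally known at the transmitter, and independent
% Gaussian noise Z_m^n of variance N at receiver m.
\begin{align}
  Y_m^n &= X^n + S_m^n + Z_m^n, \qquad m = 1, \dots, M, \\
  \frac{1}{n}\sum_{i=1}^{n} \mathbb{E}\big[X_i^2\big] &\le P.
\end{align}
% Costa's "writing on dirty paper" result for M = 1: the state can be fully
% pre-cancelled, so capacity equals that of the state-free AWGN channel,
\begin{equation}
  C_{M=1} = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N}\right).
\end{equation}
% The carbon copy setting asks how much of this rate can be guaranteed at all M
% outputs simultaneously with a single transmitted codeword.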
