Universal Decoding for Source-Channel Coding with Side Information (1507.01255v2)

Published 5 Jul 2015 in cs.IT and math.IT

Abstract: We consider a setting of Slepian-Wolf coding in which the random bin of the source vector undergoes channel coding and is then decoded at the receiver based on additional side information correlated with the source. For a given distribution of the randomly selected channel codewords, we propose a universal decoder that depends on the statistics of neither the correlated sources nor the channel, assuming first that both are memoryless. Exact analysis of the random-binning/random-coding error exponent of this universal decoder shows that it is the same as the one achieved by the optimal maximum a posteriori (MAP) decoder. Previously known results on universal Slepian-Wolf source decoding, universal channel decoding, and universal source-channel decoding are all obtained as special cases of this result. Subsequently, we further generalize the results in several directions, including: (i) finite-state sources and finite-state channels, along with a universal decoding metric based on Lempel-Ziv parsing; (ii) arbitrary sources and channels, where the universal decoding is with respect to a given class of decoding metrics; and (iii) full (symmetric) Slepian-Wolf coding, where both source streams are separately fed into random-binning source encoders, followed by random channel encoders, and are then jointly decoded by a universal decoder.
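For context on the special cases named in the abstract, the two classical universal decoding rules that the result recovers can be sketched in a standard type-based formulation. The notation below is illustrative rather than quoted from the paper: $\mathcal{B}(s)$ is the bin indexed by the transmitted bin index $s$, $v$ is the side-information sequence, $x_m$ is the codeword of message $m$, $y$ is the channel output, and $\hat{H}$, $\hat{I}$ denote empirical (type-based) conditional entropy and mutual information.

\[
\hat{u} \;=\; \arg\min_{u \in \mathcal{B}(s)} \hat{H}(u \mid v)
\qquad \text{(minimum conditional entropy decoding for Slepian-Wolf coding)}
\]
\[
\hat{m} \;=\; \arg\max_{m} \hat{I}(x_m ; y)
\qquad \text{(maximum mutual information decoding for channel coding)}
\]

The decoder proposed in the paper addresses the combined source-channel setting with side information and, per the abstract, attains the same random-binning/random-coding error exponent as the optimal MAP decoder, without knowledge of the source or channel statistics.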

Citations (7)
