The MMI Decoder is Asymptotically Optimal for the Typical Random Code and for the Expurgated Code (2007.12225v1)

Published 23 Jul 2020 in cs.IT and math.IT

Abstract: We provide two results concerning the optimality of the maximum mutual information (MMI) decoder. First, we prove that the error exponents of the typical random codes under the optimal maximum likelihood (ML) decoder and the MMI decoder are equal. As a corollary to this result, we also show that the error exponents of the expurgated codes under the ML and the MMI decoders are equal. These results strengthen the well-known result due to Csiszár and Körner, according to which these decoders achieve equal random coding error exponents, since the error exponents of the typical random code and the expurgated code are strictly higher than the random coding error exponents, at least at low coding rates. While the universal optimality of the MMI decoder, in the random-coding error exponent sense, is easily proven by commuting the expectation over the channel noise and the expectation over the ensemble, when it comes to typical and expurgated exponents, this commutation can no longer be carried out. Therefore, the proof of the universal optimality of the MMI decoder must be completely different, and it turns out to be highly non-trivial.
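For context, the MMI decoder referenced in the abstract selects the codeword whose joint empirical distribution with the received sequence yields the largest empirical mutual information, which is what makes it universal (it needs no knowledge of the channel). Below is a minimal Python sketch of that decoding rule for a discrete memoryless setting; the function names and the toy codebook are illustrative assumptions, not taken from the paper.

import numpy as np

def empirical_mutual_information(x, y, size_x, size_y):
    # Empirical mutual information (in nats) induced by the joint
    # empirical distribution (joint type) of the sequences x and y.
    n = len(x)
    joint = np.zeros((size_x, size_y))
    for a, b in zip(x, y):
        joint[a, b] += 1.0 / n
    px = joint.sum(axis=1, keepdims=True)   # empirical marginal of x
    py = joint.sum(axis=0, keepdims=True)   # empirical marginal of y
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

def mmi_decode(codebook, y, size_x, size_y):
    # Return the index of the codeword maximizing the empirical mutual
    # information with the received sequence y.
    scores = [empirical_mutual_information(x_m, y, size_x, size_y)
              for x_m in codebook]
    return int(np.argmax(scores))

# Toy usage example: two binary codewords and a received binary sequence.
codebook = [np.array([0, 0, 1, 1, 0, 1]), np.array([1, 1, 0, 0, 1, 0])]
y = np.array([0, 0, 1, 0, 0, 1])
print(mmi_decode(codebook, y, size_x=2, size_y=2))  # prints 0

The paper's results concern the error exponents achieved by this rule versus ML decoding for typical random codes and expurgated codes; the sketch only illustrates the decoding rule itself.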

Citations (5)
