
The Error Probability of Sparse Superposition Codes with Approximate Message Passing Decoding (1712.06866v4)

Published 19 Dec 2017 in cs.IT and math.IT

Abstract: Sparse superposition codes, or sparse regression codes (SPARCs), are a recent class of codes for reliable communication over the AWGN channel at rates approaching the channel capacity. Approximate message passing (AMP) decoding, a computationally efficient technique for decoding SPARCs, has been proven to be asymptotically capacity-achieving for the AWGN channel. In this paper, we refine the asymptotic result by deriving a large deviations bound on the probability of AMP decoding error. This bound gives insight into the error performance of the AMP decoder for large but finite problem sizes, yielding an error exponent as well as guidance on how the code parameters should be chosen at finite block lengths. For an appropriate choice of code parameters, we show that for any fixed rate less than the channel capacity, the decoding error probability decays exponentially in $n/(\log n)^{2T}$, where $T$, the number of AMP iterations required for successful decoding, is bounded in terms of the gap from capacity.
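
To make the setting concrete, below is a minimal, self-contained sketch of AMP decoding for a flat-power SPARC over the AWGN channel, loosely following the iteration analyzed in the paper. The specific parameter choices (L = 64 sections, M = 64 columns per section, rate R = 0.8 nats), the flat power allocation, and the empirical estimate of $\tau_t^2$ via $\|z^t\|^2/n$ are assumptions made for illustration; this is a sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative code parameters (assumptions for this sketch).
L, M = 64, 64                     # L sections of M columns each; N = L*M
N = L * M
P, sigma2 = 15.0, 1.0             # power and noise variance
C = 0.5 * np.log(1 + P / sigma2)  # AWGN capacity, ~1.386 nats
R = 0.8                           # rate in nats/transmission, below C
n = int(round(L * np.log(M) / R)) # block length: R = (L log M) / n

# Encode: design matrix with i.i.d. N(0, 1/n) entries; the message places
# one nonzero of value sqrt(n*P/L) in each section (flat power allocation).
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, N))
msg = rng.integers(0, M, size=L)
beta = np.zeros(N)
beta[np.arange(L) * M + msg] = np.sqrt(n * P / L)

y = A @ beta + rng.normal(0.0, np.sqrt(sigma2), size=n)   # AWGN channel

# AMP decoding (the paper tracks tau_t^2 via state evolution; the empirical
# estimate ||z||^2 / n used below is a common practical substitute).
beta_hat = np.zeros(N)
z = y.copy()
tau2 = sigma2 + P                 # state-evolution start: tau_0^2 = sigma^2 + P
c = np.sqrt(n * P / L)            # value of the nonzero entry in each section
for t in range(25):
    s = beta_hat + A.T @ z        # effective observation, approx. beta + tau*noise
    # Section-wise denoiser eta^t: posterior mean of beta given s, i.e. a
    # softmax within each section (computed stably by subtracting the max).
    u = (s * c / tau2).reshape(L, M)
    u -= u.max(axis=1, keepdims=True)
    probs = np.exp(u)
    probs /= probs.sum(axis=1, keepdims=True)
    beta_hat = (c * probs).reshape(N)
    # Residual with the Onsager correction term for this denoiser.
    z = y - A @ beta_hat + (z / tau2) * (P - beta_hat @ beta_hat / n)
    tau2 = z @ z / n              # empirical estimate of tau_t^2

# Hard decision: largest entry in each section.
decoded = beta_hat.reshape(L, M).argmax(axis=1)
print("section error rate:", np.mean(decoded != msg))
```

At this rate (roughly 0.58 of capacity) the decoder should drive the section error rate to zero within a few iterations; pushing R toward C, or shrinking n, illustrates the finite-length trade-offs that the paper's bound quantifies.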

Authors (2)
  1. Cynthia Rush (29 papers)
  2. Ramji Venkataramanan (45 papers)
Citations (15)
