
Mutual Information-Maximizing Quantized Belief Propagation Decoding of Regular LDPC Codes (1904.06666v5)

Published 14 Apr 2019 in cs.IT and math.IT

Abstract: In this paper, we propose a class of finite alphabet iterative decoders (FAIDs), called mutual information-maximizing quantized belief propagation (MIM-QBP) decoders, for decoding regular low-density parity-check (LDPC) codes. Our decoder follows the reconstruction-calculation-quantization (RCQ) decoding architecture that is widely used in FAIDs. We present the first complete and systematic design framework for the RCQ parameters, and prove that, with sufficient precision at the node updates, our design maximizes the mutual information between coded bits and exchanged messages. Simulation results show that the MIM-QBP decoder consistently and considerably outperforms state-of-the-art mutual information-maximizing FAIDs that adopt two-input single-output lookup tables for decoding. Furthermore, with only 3 bits per exchanged message, the MIM-QBP decoder can outperform the floating-point belief propagation decoder in the high signal-to-noise ratio region on high-rate LDPC codes with maximums of 10 and 30 iterations.
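
The abstract's RCQ (reconstruction-calculation-quantization) architecture can be illustrated with a minimal sketch of a single quantized variable-node update. The function name, the 3-bit message width, and the reconstruction values and quantization thresholds below are placeholders chosen for illustration; in the MIM-QBP design described in the paper, these parameters are designed offline so that the quantizer maximizes the mutual information between coded bits and exchanged messages, which this sketch does not attempt to reproduce.

```python
import numpy as np

BITS = 3                 # bits per exchanged message (as in the abstract)
LEVELS = 2 ** BITS       # number of message levels

# Hypothetical reconstruction table: maps a 3-bit message index to a real value.
RECONSTRUCT = np.linspace(-3.5, 3.5, LEVELS)

# Hypothetical quantization thresholds (LEVELS - 1 decision boundaries).
THRESHOLDS = np.linspace(-3.0, 3.0, LEVELS - 1)


def variable_node_update(channel_llr, incoming_msgs):
    """One RCQ-style variable-node update for a single outgoing edge.

    channel_llr   : channel LLR observed at this variable node
    incoming_msgs : 3-bit message indices from the other connected check nodes
    Returns a 3-bit outgoing message index.
    """
    # Reconstruction: map quantized messages back to real values.
    reconstructed = RECONSTRUCT[np.asarray(incoming_msgs, dtype=int)]
    # Calculation: combine the channel LLR with the reconstructed messages.
    total = channel_llr + reconstructed.sum()
    # Quantization: threshold the result back down to a 3-bit message.
    return int(np.searchsorted(THRESHOLDS, total))


# Example: channel LLR of 0.8 and two incoming 3-bit messages.
print(variable_node_update(0.8, [5, 2]))
```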

Citations (7)
