
Analysis of Quasi-Cyclic LDPC codes under ML decoding over the erasure channel (1004.5217v1)

Published 29 Apr 2010 in cs.IT and math.IT

Abstract: In this paper, we show that Quasi-Cyclic LDPC codes can efficiently accommodate hybrid iterative/ML decoding over the binary erasure channel. We demonstrate that the quasi-cyclic structure of the parity-check matrix can be advantageously used to significantly reduce the complexity of ML decoding. This is achieved by a simple row/column permutation that transforms a QC matrix into a pseudo-band form. Based on this approach, we propose a class of QC-LDPC codes with almost ideal error correction performance under ML decoding, while the required number of row/symbol operations scales as $k\sqrt{k}$, where $k$ is the number of source symbols.
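
The sketch below illustrates the permutation idea from the abstract, not the paper's actual construction: interleaving the rows and columns of a quasi-cyclic parity-check matrix across its circulant blocks concentrates the nonzeros near a diagonal band, with only a few wrap-around entries falling outside it (a "pseudo-band" form). The base-matrix dimensions and circulant shifts are illustrative values chosen for the demo, and the exact permutation used in the paper may differ.

```python
import numpy as np

def circulant_perm(L, shift):
    """L x L circulant permutation matrix: row i has its 1 in column (i + shift) mod L."""
    P = np.zeros((L, L), dtype=np.uint8)
    P[np.arange(L), (np.arange(L) + shift) % L] = 1
    return P

def qc_parity_check(shifts, L):
    """Assemble H from an M x N base matrix of circulant shifts (-1 denotes an all-zero block)."""
    M, N = shifts.shape
    H = np.zeros((M * L, N * L), dtype=np.uint8)
    for a in range(M):
        for b in range(N):
            if shifts[a, b] >= 0:
                H[a*L:(a+1)*L, b*L:(b+1)*L] = circulant_perm(L, shifts[a, b])
    return H

def interleave(H, M, N, L):
    """Block interleaving: row a*L + i -> i*M + a, column b*L + j -> j*N + b."""
    row_perm = np.array([i*M + a for a in range(M) for i in range(L)])
    col_perm = np.array([j*N + b for b in range(N) for j in range(L)])
    Hp = np.zeros_like(H)
    Hp[np.ix_(row_perm, col_perm)] = H   # Hp[row_perm[r], col_perm[c]] = H[r, c]
    return Hp

def off_band_count(A, half_width):
    """Number of nonzeros lying further than half_width from the scaled main diagonal."""
    r, c = np.nonzero(A)
    offset = c - r * (A.shape[1] / A.shape[0])
    return int(np.sum(np.abs(offset) > half_width))

# Illustrative parameters (not taken from the paper).
M, N, L = 2, 4, 16
shifts = np.array([[0, 1, 2, 3],
                   [3, 2, 1, 0]])
H  = qc_parity_check(shifts, L)
Hp = interleave(H, M, N, L)
w = (shifts.max() + 1) * N               # band half-width implied by the shift range
print("nonzeros off-band before:", off_band_count(H,  w))
print("nonzeros off-band after :", off_band_count(Hp, w))
```

After the interleaving, the only nonzeros outside the band come from the modular wrap-around inside each circulant, which is why the result is a pseudo-band rather than a strict band; a banded structure of this kind is what allows Gaussian-elimination-based ML erasure decoding to run with far fewer row/symbol operations than on an unstructured matrix.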

Citations (10)
