On the Role of Quantization of Soft Information in GRAND

Published 25 Mar 2022 in cs.IT and math.IT (arXiv:2203.13552v3)

Abstract: In this work, we investigate guessing random additive noise decoding (GRAND) with quantized soft input. First, we analyze the achievable rate of ordered reliability bits GRAND (ORBGRAND), which uses the rank order of the bit reliabilities as quantized soft information. We show that multi-line ORBGRAND can approach capacity at any signal-to-noise ratio (SNR). We then introduce discretized soft GRAND (DSGRAND), which uses information from a conventional quantizer. Simulation results show that DSGRAND closely approximates maximum-likelihood (ML) decoding with a number of quantization bits in line with current soft decoding implementations. For a (128,106) CRC-concatenated polar code, basic ORBGRAND matches or outperforms CRC-aided successive cancellation list (CA-SCL) decoding with a codeword list size of 64 while using only 3 bits of quantized soft information, and DSGRAND outperforms CA-SCL decoding with a list size of 128 codewords. Both ORBGRAND and DSGRAND exhibit approximately an order of magnitude lower average complexity and two orders of magnitude smaller memory requirements than CA-SCL.
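To make the basic ORBGRAND idea concrete, here is a minimal illustrative sketch, not the authors' implementation. It assumes a binary code with BPSK-style LLR inputs, generates test error patterns in increasing logistic weight (the sum of the reliability ranks of the flipped bits, where rank 1 is the least reliable position), and uses a parity-check syndrome test as the codebook-membership query. The names orbgrand_decode, distinct_partitions, and is_codeword are hypothetical, and the recursive partition generator is a simple stand-in for the on-the-fly pattern generators used in real implementations.

```python
import numpy as np

def distinct_partitions(total, max_part):
    """Yield partitions of `total` into distinct positive parts, each <= max_part."""
    if total == 0:
        yield []
        return
    for part in range(min(total, max_part), 0, -1):
        for rest in distinct_partitions(total - part, part - 1):
            yield [part] + rest

def orbgrand_decode(llr, is_codeword, max_logistic_weight=None):
    """Basic ORBGRAND sketch: guess error patterns in increasing logistic
    weight, where a pattern's logistic weight is the sum of the reliability
    ranks (1 = least reliable) of the bits it flips."""
    llr = np.asarray(llr, dtype=float)
    n = len(llr)
    if max_logistic_weight is None:
        max_logistic_weight = n * (n + 1) // 2   # enough to cover every pattern
    hard = (llr < 0).astype(int)                 # hard decisions from LLR signs
    order = np.argsort(np.abs(llr))              # bit indices, least reliable first
    queries = 1
    if is_codeword(hard):
        return hard, queries
    for w in range(1, max_logistic_weight + 1):
        for ranks in distinct_partitions(w, n):
            cand = hard.copy()
            for r in ranks:                      # rank r -> r-th least reliable bit
                cand[order[r - 1]] ^= 1
            queries += 1
            if is_codeword(cand):                # first hit is the decoded word
                return cand, queries
    return hard, queries                         # abandonment: budget exhausted

# Toy usage with a (7,4) Hamming code: membership is a syndrome (parity) test.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
is_cw = lambda c: not np.any(H.dot(c) % 2)
llr = np.array([2.1, -1.8, 0.3, 3.0, -0.2, 1.5, 2.7])   # toy channel LLRs
decoded, n_queries = orbgrand_decode(llr, is_cw)
print(decoded, n_queries)
```

Note that this sketch covers only basic ORBGRAND, which uses rank order alone; the multi-line ORBGRAND and DSGRAND variants discussed in the paper refine the pattern ordering with additional quantized reliability information and are not shown here.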
