Quantized Polar Code Decoders: Analysis and Design

Published 27 Feb 2019 in cs.IT and math.IT (arXiv:1902.10395v1)

Abstract: Applications of massive machine-type communications, such as sensor networks, smart metering, 'internet-of-things', or process and factory automation, are forecast to have great economic impact in the next five to ten years. Low-complexity energy- and cost-efficient communication schemes are essential to enable large-scale deployments. To target these requirements, we study decoding of polar codes with coarse quantization in the short block length regime. In particular, we devise schemes to mitigate the impact of coarse quantization, which has not been adequately explored in prior works. We introduce the 3-level quantized successive cancellation (SC) and SC list (SCL) decoder. Coarse quantization of log-likelihood ratios (LLRs) leads to quantization of path metrics (PMs). Quantized PMs severely impact the list management of SCL decoders, and hence cause performance degradation. Two mitigation strategies are presented: 1.) Selecting the winning codeword from the decoder's list based on maximum-likelihood (ML) rather than PM. 2.) Utilizing statistical knowledge about the reliability of bit estimates in each decoding step to improve list management. We demonstrate the effectiveness of our techniques in simulations. In particular, our enhancements prove useful in the low code rate regime, where theory available in the literature predicts pronounced losses caused by quantization. Furthermore, we put our work into perspective by comparing it to finer quantization and partially unquantized schemes. This yields insights and recommendations as to which quantization schemes offer the best cost-benefit ratio for practical implementation.
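Two of the abstract's ingredients can be sketched in code: coarse 3-level quantization of LLRs (sign plus an "erasure" level), the hard-decision path-metric penalty it induces, and selecting the winning codeword from the decoder's list by maximizing correlation with the channel LLRs (ML on a binary-input AWGN channel) rather than by path metric. This is a minimal illustrative sketch, not the paper's implementation: the function names, the threshold value, and the sign convention (positive LLR favours bit 0) are assumptions.

```python
import numpy as np

def quantize_llr_3level(llr, threshold=0.5):
    """Map each LLR to {-1, 0, +1}: keep the sign if |llr| >= threshold,
    otherwise output 0 (an 'erasure'). Threshold is an illustrative choice."""
    q = np.zeros_like(llr, dtype=int)
    q[llr >= threshold] = 1
    q[llr <= -threshold] = -1
    return q

def pm_update(pm, llr_q, bit):
    """Standard SCL path-metric approximation with quantized LLRs: add |llr_q|
    when the decided bit disagrees with the LLR sign (positive LLR -> bit 0).
    With 3-level LLRs the penalty is always 0 or 1, which is what coarsens
    the path metrics and degrades list management."""
    if (llr_q > 0 and bit == 1) or (llr_q < 0 and bit == 0):
        return pm + abs(llr_q)
    return pm

def ml_select(candidates, channel_llrs):
    """Mitigation (1): pick the list entry best matching the channel LLRs,
    i.e. maximize the correlation sum_i (1 - 2*c_i) * llr_i, which is the
    ML decision on a binary-input AWGN channel."""
    corr = [np.sum((1 - 2 * np.asarray(c)) * channel_llrs) for c in candidates]
    return candidates[int(np.argmax(corr))]
```

With quantized path metrics, many list entries can share the same (integer) PM, so tie-breaking by PM alone is uninformative; re-scoring the final list against the finer channel observations, as `ml_select` does, recovers part of the loss.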
