Efficient Deep Neural Receiver with Post-Training Quantization

Published 8 Aug 2025 in eess.SP (arXiv:2508.06275v1)

Abstract: Deep learning has recently garnered significant interest in wireless communications due to its superior performance compared to traditional model-based algorithms. Deep convolutional neural networks (CNNs) have demonstrated notable improvements in block error rate (BLER) under various channel models and mobility scenarios. However, the high computational complexity and resource demands of deep CNNs pose challenges for deployment in resource-constrained edge systems. The 3rd Generation Partnership Project (3GPP) Release 20 highlights the pivotal role of AI integration in enabling advanced radio-access networks for 6G systems. The hard real-time processing demands of 5G and 6G require efficient techniques such as post-training quantization (PTQ), quantization-aware training (QAT), pruning, and hybrid approaches to meet latency requirements. In this paper, we focus on PTQ to reduce model complexity by lowering the bit-width of weights, thereby enhancing computational efficiency. Our analysis employs symmetric uniform quantization, applying both per-tensor and per-channel PTQ to a neural receiver, achieving performance comparable to that of the full-precision model. Specifically, 8-bit per-channel quantization maintains BLER performance with minimal degradation, while 4-bit quantization shows great promise but requires further optimization to achieve target BLER levels. These results highlight the potential of ultra-low bit-width PTQ for efficient neural receiver deployment in 6G systems.
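
To make the quantization scheme concrete, below is a minimal NumPy sketch of symmetric uniform PTQ in both per-tensor and per-channel modes, as described in the abstract. The function names, the restricted clipping range, and the example tensor shape are illustrative assumptions, not the paper's implementation.

import numpy as np

def symmetric_quantize(w, num_bits=8, per_channel=False, axis=0):
    # Symmetric uniform PTQ: scale each weight by max|w| / qmax and round.
    # Per-tensor mode uses one scale for the whole tensor; per-channel mode
    # derives one scale per slice along `axis` (typically output channels).
    # Names and conventions here are hypothetical, not from the paper's code.
    qmax = 2 ** (num_bits - 1) - 1  # e.g. 127 for 8-bit, 7 for 4-bit
    if per_channel:
        # Reduce over every axis except the channel axis, keeping dims
        # so the scale broadcasts back against w.
        reduce_axes = tuple(i for i in range(w.ndim) if i != axis)
        max_abs = np.max(np.abs(w), axis=reduce_axes, keepdims=True)
    else:
        max_abs = np.max(np.abs(w))
    scale = np.maximum(max_abs, 1e-12) / qmax  # guard against all-zero slices
    # Restricted symmetric range [-qmax, qmax]; int8 storage also covers 4-bit codes.
    q = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Example: a random conv-style weight tensor (out_ch, in_ch, k, k)
w = np.random.randn(16, 8, 3, 3).astype(np.float32)
q8, s8 = symmetric_quantize(w, num_bits=8, per_channel=True, axis=0)
q4, s4 = symmetric_quantize(w, num_bits=4, per_channel=True, axis=0)
print("8-bit per-channel MSE:", np.mean((w - dequantize(q8, s8)) ** 2))
print("4-bit per-channel MSE:", np.mean((w - dequantize(q4, s4)) ** 2))

Running the sketch makes the abstract's finding plausible at a glance: the 4-bit reconstruction error is orders of magnitude larger than the 8-bit one, which is consistent with 8-bit per-channel PTQ preserving BLER while 4-bit needs further optimization.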
