Low-Complexity Block-Based Decoding Algorithms for Short Block Channels (2404.10798v2)

Published 15 Apr 2024 in cs.IT, cs.ET, and math.IT

Abstract: This paper presents low-complexity block-based encoding and decoding algorithms for short block length channels. In terms of the precise use case, we are primarily concerned with baseline 3GPP short block transmissions in which payloads are encoded by Reed-Muller codes and paired with orthogonal DMRS. In contemporary communication systems, short block decoding often employs DMRS-based least squares channel estimation followed by maximum likelihood decoding. However, this methodology can incur substantial computational complexity when processing long bit length codes. We propose an approach that tackles this challenge by introducing the principle of block/segment encoding using first-order RM codes, which is amenable to low-cost decoding through block-based fast Hadamard transforms. The block-based FHT has been demonstrated to be cost-efficient with regard to decoding time, as the complexity improves from quadratic to quasi-linear with a manageable decline in performance. Additionally, by incorporating an adaptive DMRS/data power adjustment technique, we can reduce the performance gap and attain high sensitivity, leading to a good trade-off between performance and complexity when handling small payloads.
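The FHT-based decoding the abstract refers to exploits a classical property of first-order Reed-Muller codes: maximum-likelihood decoding of RM(1, m) reduces to a fast Walsh-Hadamard transform of the received soft values, followed by a search for the coefficient of largest magnitude. The sketch below illustrates only this core idea in Python; it is not the paper's implementation, and the block/segment structure, DMRS channel estimation, and adaptive power adjustment described in the abstract are omitted.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform, O(n log n) butterflies (n a power of 2)."""
    x = x.astype(float).copy()
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def rm1_encode(msg_bits, m):
    """Encode [u0, u1, ..., um] into an RM(1, m) codeword of length 2^m.

    Codeword bit i is u0 XOR <(u1..um), binary digits of i>.
    """
    n = 1 << m
    u0, u = msg_bits[0], msg_bits[1:]
    c = np.zeros(n, dtype=int)
    for i in range(n):
        s = u0
        for k in range(m):
            if (i >> k) & 1:
                s ^= u[k]
        c[i] = s
    return c

def rm1_decode(soft):
    """Soft-decision ML decoding of RM(1, m) via a single FWHT.

    `soft` holds BPSK-style values (0 -> +1, 1 -> -1, plus noise).
    The transform correlates the input with every codeword at once:
    the index of the largest |coefficient| recovers (u1..um), and its
    sign recovers u0.
    """
    t = fwht(soft)
    idx = int(np.argmax(np.abs(t)))
    m = int(np.log2(len(soft)))
    u0 = 0 if t[idx] > 0 else 1
    return [u0] + [(idx >> k) & 1 for k in range(m)]
```

A brute-force ML decoder correlates the received vector against all 2^(m+1) codewords of length 2^m, costing O(n^2) operations per block; the FWHT computes all correlations simultaneously in O(n log n), which is the quadratic-to-quasi-linear improvement the abstract cites.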


Authors (2)

