Constant Weight Codes with Gabor Dictionaries and Bayesian Decoding for Massive Random Access (2207.13049v1)

Published 26 Jul 2022 in cs.IT, eess.SP, and math.IT

Abstract: This paper considers a general framework for massive random access based on sparse superposition coding. We provide guidelines for the code design and propose the use of constant-weight codes in combination with a dictionary design based on Gabor frames. The decoder applies an extension of approximate message passing (AMP) by iteratively exchanging soft information between an AMP module that accounts for the dictionary structure, and a second inference module that utilizes the structure of the involved constant-weight code. We apply the encoding structure to (i) the unsourced random access setting, where all users employ a common dictionary, and (ii) the "sourced" random access setting with user-specific dictionaries. When applied to a fading scenario, the communication scheme essentially operates non-coherently, as channel state information is required neither at the transmitter nor at the receiver. We observe that in regimes of practical interest, the proposed scheme compares favorably with state-of-the-art schemes in terms of the (per-user) energy-per-bit requirement, as well as the number of active users that can be simultaneously accommodated in the system. Importantly, this is achieved with a considerably smaller size of the transmitted codewords, potentially yielding lower latency and bandwidth occupancy, as well as lower implementation complexity.
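The abstract describes codewords formed as sparse superpositions of dictionary columns, where the dictionary is a Gabor frame (all time-frequency shifts of a seed vector) and the active column set is constrained by a constant-weight code. The sketch below is a minimal, hypothetical Python/NumPy illustration of that encoding structure, not the paper's implementation: the dimensions, the one-nonzero-per-section weight-w mapping, and the decoder are assumptions chosen for brevity. In particular, a plain matched-filter (maximum-correlation) detector stands in for the paper's actual decoder, which is an AMP extension that iteratively exchanges soft information with a constant-weight-code inference module.

```python
import numpy as np

rng = np.random.default_rng(0)

def gabor_dictionary(seed):
    """Build an n x n^2 Gabor dictionary: all cyclic time shifts and
    frequency modulations of the seed vector, with unit-norm columns."""
    n = len(seed)
    cols = []
    for tau in range(n):                      # cyclic time shift
        shifted = np.roll(seed, tau)
        for f in range(n):                    # frequency modulation
            mod = np.exp(2j * np.pi * f * np.arange(n) / n)
            cols.append(shifted * mod)
    A = np.stack(cols, axis=1)
    return A / np.linalg.norm(A, axis=0)

n = 16                                        # signal dimension (illustrative)
seed = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = gabor_dictionary(seed)                    # 16 x 256 dictionary

# Constant-weight encoding (assumed scheme): split the 256 columns into
# w sections and activate exactly one column per section, giving a
# weight-w sparse vector x; the codeword is the superposition A @ x.
w = 4
section = A.shape[1] // w                     # 64 columns per section
bits_per_section = int(np.log2(section))      # 6 bits select one column

def encode(msg_bits):
    x = np.zeros(A.shape[1], dtype=complex)
    for s in range(w):
        chunk = msg_bits[s * bits_per_section:(s + 1) * bits_per_section]
        idx = int("".join(map(str, chunk)), 2)
        x[s * section + idx] = 1.0            # exactly one nonzero per section
    return x

msg = rng.integers(0, 2, size=w * bits_per_section)   # 24 message bits
x = encode(msg)
y = A @ x + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Matched-filter detection per section: a simple stand-in for the
# AMP-based soft decoder described in the abstract.
scores = np.abs(A.conj().T @ y)
decoded = [int(np.argmax(scores[s * section:(s + 1) * section])) for s in range(w)]
sent = [int(np.argmax(np.abs(x[s * section:(s + 1) * section]))) for s in range(w)]
print("sent:", sent, "decoded:", decoded)
```

The one-active-column-per-section mapping is one simple instance of a constant-weight code; the paper's design space is more general, and its AMP decoder exploits soft information across sections rather than deciding each section independently as the matched filter above does.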

Citations (5)
