perm2vec: Graph Permutation Selection for Decoding of Error Correction Codes using Self-Attention (2002.02315v2)

Published 6 Feb 2020 in cs.IT, cs.LG, and math.IT

Abstract: Error correction codes are an integral part of communication applications, boosting the reliability of transmission. The optimal decoding of transmitted codewords is the maximum likelihood rule, which is NP-hard due to the curse of dimensionality. For practical realizations, sub-optimal decoding algorithms are employed; yet limited theoretical insights prevent one from exploiting the full potential of these algorithms. One such insight is the choice of permutation in permutation decoding. We present a data-driven framework for permutation selection, combining domain knowledge with machine learning concepts such as node embedding and self-attention. The framework yields significant and consistent bit error rate improvements over the baseline decoders for all simulated codes. To the best of the authors' knowledge, this work is the first to leverage the benefits of neural Transformer networks in physical layer communication systems.
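
The abstract outlines the idea at a high level: embed the nodes of the code's graph, apply self-attention, and use the result to select a permutation before decoding. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' perm2vec implementation; all module names, dimensions, and the candidate-ranking setup are assumptions made for the example.

```python
# Hypothetical sketch of permutation selection with self-attention
# (illustrative only; not the architecture from the paper).
import torch
import torch.nn as nn

class PermutationScorer(nn.Module):
    def __init__(self, num_nodes: int, embed_dim: int = 32, num_heads: int = 4):
        super().__init__()
        # Learned embedding per variable node, standing in for a graph
        # node-embedding step (e.g. node2vec-style features).
        self.node_embedding = nn.Embedding(num_nodes, embed_dim)
        self.attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.score_head = nn.Linear(embed_dim, 1)

    def forward(self, permutation: torch.Tensor) -> torch.Tensor:
        # permutation: (batch, code_length) integer tensor; each row is one
        # candidate permutation of the variable-node indices.
        x = self.node_embedding(permutation)        # (batch, n, d)
        attn_out, _ = self.attention(x, x, x)       # self-attention over nodes
        pooled = attn_out.mean(dim=1)               # (batch, d)
        return self.score_head(pooled).squeeze(-1)  # (batch,) scalar scores

# Usage: rank candidate permutations and keep the highest-scoring one.
code_length = 31  # illustrative code length
candidates = torch.stack([torch.randperm(code_length) for _ in range(8)])
scorer = PermutationScorer(num_nodes=code_length)
scores = scorer(candidates)
best_perm = candidates[scores.argmax()]
# best_perm would then be applied to the received word before running a
# standard sub-optimal decoder such as belief propagation (not shown).
```

In such a setup the scorer would be trained so that permutations leading to successful decoding receive higher scores, replacing an exhaustive search over the permutation group with a single ranking pass.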
