
Unveiling the Power of Complex-Valued Transformers in Wireless Communications (2502.11151v1)

Published 16 Feb 2025 in eess.SP

Abstract: Utilizing complex-valued neural networks (CVNNs) in wireless communication tasks has received growing attention for their ability to provide a natural and effective representation of complex-valued signals and data. However, existing studies typically employ complex-valued versions of simple neural network architectures. Not only do they merely scratch the surface of the extensive range of modern deep learning techniques, but a theoretical understanding of the superior performance of CVNNs is also missing. To this end, this paper aims to fill both the theoretical and practical gaps in employing CVNNs in wireless communications. In particular, we provide a comprehensive description of the various operations in CVNNs and theoretically prove that a CVNN requires fewer layers than its real-valued counterpart to achieve a given approximation error for a continuous function. Furthermore, to advance CVNNs in the field of wireless communications, this paper focuses on the transformer model, which represents a more sophisticated deep learning architecture and has shown excellent performance in wireless communications, but so far only in its real-valued form. In this respect, we propose a fundamental paradigm of complex-valued transformers for wireless communications. Leveraging this structure, we develop customized complex-valued transformers for three representative applications in wireless communications: channel estimation, user activity detection, and precoding design. These applications utilize transformers with varying levels of sophistication and span a variety of tasks, ranging from regression to classification, supervised to unsupervised learning, and specific module design to end-to-end design. Experimental results demonstrate the superior performance of the complex-valued transformers in the above three applications compared to traditional real-valued neural-network-based methods.
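
The "various operations in CVNNs" referred to in the abstract typically come down to implementing complex arithmetic with pairs of real tensors. As a rough illustration only (this `ComplexLinear` module is a hypothetical sketch, not code from the paper), a complex-valued linear layer, the basic building block a complex-valued transformer would stack into attention and feed-forward sublayers, can be written as:

```python
import torch
import torch.nn as nn

# Hypothetical sketch (not the authors' implementation): a complex-valued
# linear layer built from real-valued primitives. The complex weight
# W = W_r + j*W_i is stored as two real matrices, and for a complex input
# x = x_r + j*x_i the product expands to
#   W x = (W_r x_r - W_i x_i) + j (W_r x_i + W_i x_r).
class ComplexLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Bias omitted for clarity; a complex bias would add two real vectors.
        self.w_real = nn.Linear(in_features, out_features, bias=False)
        self.w_imag = nn.Linear(in_features, out_features, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x_r, x_i = x.real, x.imag
        out_r = self.w_real(x_r) - self.w_imag(x_i)
        out_i = self.w_real(x_i) + self.w_imag(x_r)
        return torch.complex(out_r, out_i)

# Usage: project a batch of complex baseband samples (e.g., received pilots).
layer = ComplexLinear(64, 32)
x = torch.randn(8, 64, dtype=torch.cfloat)  # hypothetical complex-valued input
y = layer(x)                                # complex output of shape (8, 32)
```

Stacking such layers with complex-compatible activations and attention is the general recipe the abstract alludes to; the specific paradigm proposed for channel estimation, user activity detection, and precoding design is detailed in the paper itself.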
