Deep Learning for Distributed Channel Feedback and Multiuser Precoding in FDD Massive MIMO (2007.06512v2)

Published 13 Jul 2020 in cs.IT, eess.SP, and math.IT

Abstract: This paper shows that a deep neural network (DNN) can be used for efficient and distributed channel estimation, quantization, feedback, and downlink multiuser precoding in a frequency-division duplex massive multiple-input multiple-output system in which a base station (BS) serves multiple mobile users, but with rate-limited feedback from the users to the BS. A key observation is that the multiuser channel estimation and feedback problem can be thought of as a distributed source coding problem. In contrast to the traditional approach, where the channel state information (CSI) is estimated and quantized at each user independently, this paper shows that a joint design of the pilots and a new DNN architecture, which maps the received pilots directly into feedback bits at the user side and then maps the feedback bits from all the users directly into the precoding matrix at the BS, can significantly improve the overall performance. This paper further proposes design strategies that are robust with respect to the channel parameters, as well as a generalizable DNN architecture for a varying number of users and feedback bits. Numerical results show that the DNN-based approach with short pilot sequences and very limited feedback overhead can already approach the performance of conventional linear precoding schemes with full CSI.
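To make the described architecture concrete, the sketch below illustrates the overall data flow: each user's DNN encoder maps its received pilots to a small number of feedback bits, and a BS-side DNN maps the concatenated bits from all users to a downlink precoding matrix. The layer sizes, activations, and sign-based binarization are illustrative assumptions, not the paper's exact design, and the weights are untrained.

```python
# Minimal NumPy sketch of the end-to-end pipeline described in the abstract:
# per-user encoders (pilots -> feedback bits) and a BS decoder
# (all users' bits -> precoding matrix). Dimensions and activations are
# assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

M, K, L, B = 64, 2, 8, 30   # BS antennas, users, pilot length, feedback bits per user

def mlp(dims):
    """Random (untrained) fully connected layers, for structural illustration."""
    return [(rng.standard_normal((i, o)) / np.sqrt(i), np.zeros(o))
            for i, o in zip(dims[:-1], dims[1:])]

def forward(layers, x):
    for idx, (W, b) in enumerate(layers):
        x = x @ W + b
        if idx < len(layers) - 1:
            x = np.maximum(x, 0.0)          # ReLU on hidden layers
    return x

# Per-user encoder: real/imag parts of L received pilot symbols -> B bits.
enc = [mlp([2 * L, 256, B]) for _ in range(K)]
# BS decoder: K*B feedback bits -> M x K complex precoder (real/imag stacked).
dec = mlp([K * B, 512, 2 * M * K])

def user_feedback(k, y_k):
    """Quantize user k's encoder output to +/-1 bits (sign binarization assumed)."""
    feats = np.concatenate([y_k.real, y_k.imag])
    return np.sign(forward(enc[k], feats))

def bs_precoder(bits_all):
    """Map the concatenated feedback bits of all users to a normalized precoder."""
    out = forward(dec, np.concatenate(bits_all))
    V = (out[:M * K] + 1j * out[M * K:]).reshape(M, K)
    return V / np.linalg.norm(V, 'fro')     # enforce a total power constraint

# Toy end-to-end pass with random received pilots at each user.
y = [rng.standard_normal(L) + 1j * rng.standard_normal(L) for _ in range(K)]
bits = [user_feedback(k, y[k]) for k in range(K)]
V = bs_precoder(bits)
print("Precoding matrix shape:", V.shape)   # (64, 2)
```

In training, the non-differentiable sign step would typically be handled with a relaxation or straight-through estimator, and the encoders, pilots, and decoder would be optimized jointly end to end; this sketch only shows the inference-time data flow.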

Citations (106)
