Deep Learning based Efficient Symbol-Level Precoding Design for MU-MISO Systems (2104.09799v1)

Published 20 Apr 2021 in cs.IT, eess.SP, and math.IT

Abstract: The recently emerged symbol-level precoding (SLP) technique has been regarded as a promising solution for multi-user wireless communication systems, since it can convert harmful multi-user interference (MUI) into beneficial signal power that enhances system performance. However, the tremendous computational complexity of conventional symbol-level precoding designs severely hinders their practical implementation. To tackle this difficulty, we propose a novel deep learning (DL) based approach to efficiently design symbol-level precoders. In particular, in this correspondence we consider a multi-user multiple-input single-output (MU-MISO) downlink system. An efficient precoding neural network (EPNN) is introduced to optimize the symbol-level precoders, maximizing the minimum quality-of-service (QoS) over all users under a power constraint. Simulation results demonstrate that the proposed EPNN based SLP design can dramatically reduce computing time at the price of a slight performance loss compared with the conventional convex-optimization based SLP design.
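The abstract does not spell out the EPNN architecture or training objective, so the following is only a minimal sketch of the general idea: a small fully connected network that maps a channel realization and the users' data symbols to a transmit vector satisfying the power budget, trained without labels to maximize the worst user's constructive-interference margin (a common symbol-level QoS metric for PSK signaling). The dimensions, layer sizes, PSK order, and loss below are assumptions for illustration, not the paper's EPNN specification.

```python
# Hypothetical sketch of a learned symbol-level precoder for a MU-MISO downlink.
# Architecture, loss, and hyper-parameters are illustrative assumptions only.
import math
import torch
import torch.nn as nn

K, N = 4, 8        # users, transmit antennas (assumed)
P = 1.0            # total transmit power budget (assumed)
M_PSK = 4          # QPSK constellation (assumed)

class SLPNet(nn.Module):
    """Maps (channel H, symbol vector s) to a transmit vector x with ||x||^2 <= P."""
    def __init__(self, k=K, n=N, hidden=256):
        super().__init__()
        in_dim = 2 * k * n + 2 * k                  # real/imag parts of H and s
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * n),               # real/imag parts of x
        )
        self.n = n

    def forward(self, H, s):
        feat = torch.cat([H.real.flatten(1), H.imag.flatten(1),
                          s.real, s.imag], dim=1)
        out = self.net(feat)
        x = torch.complex(out[:, :self.n], out[:, self.n:])
        # Scale down (never up) so the power constraint ||x||^2 <= P always holds.
        norm = torch.linalg.vector_norm(x, dim=1, keepdim=True)
        scale = torch.clamp(math.sqrt(P) / (norm + 1e-12), max=1.0)
        return x * scale

def min_qos_margin(H, s, x, m=M_PSK):
    """Constructive-interference safety margin for M-PSK; returns the worst user's margin."""
    r = torch.einsum('bkn,bn->bk', H, x) * s.conj()   # rotate received signal by symbol phase
    margin = r.real * math.sin(math.pi / m) - r.imag.abs() * math.cos(math.pi / m)
    return margin.min(dim=1).values

# Unsupervised training loop on random Rayleigh channels and random PSK symbols:
# push the worst-user margin up, which is a max-min QoS surrogate.
model = SLPNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    H = (torch.randn(64, K, N) + 1j * torch.randn(64, K, N)) / math.sqrt(2)
    phases = 2 * math.pi * torch.randint(M_PSK, (64, K)) / M_PSK
    s = torch.exp(1j * phases)
    x = model(H, s)
    loss = -min_qos_margin(H, s, x).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

Clamping the output norm enforces the power budget by construction, so the unsupervised loss only has to raise the worst-user margin; the actual EPNN and its training procedure in the paper may differ from this sketch.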

Citations (15)
