
GMM-based Codebook Construction and Feedback Encoding in FDD Systems (2205.12002v1)

Published 24 May 2022 in cs.IT, eess.SP, and math.IT

Abstract: We propose a precoder codebook construction and feedback encoding scheme based on Gaussian mixture models (GMMs). In an offline phase, the base station (BS) first fits a GMM to uplink (UL) training samples. Thereafter, it designs a codebook in an unsupervised manner by exploiting the GMM's clustering capability, with one codebook entry per GMM component. After offloading the GMM (but not the codebook) to the mobile terminal (MT) in the online phase, the MT uses the GMM to determine the best-fitting codebook entry. Notably, no channel estimation is necessary at the MT. Instead, the MT's observed signal is used to evaluate how responsible each GMM component is for the signal. The feedback consists of the index of the GMM component with the highest responsibility, and the BS then employs the corresponding codebook entry. Simulation results show that the proposed codebook design and feedback encoding scheme outperforms conventional Lloyd-clustering-based codebook design algorithms, especially in configurations with reduced pilot overhead.
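The online feedback step described above can be sketched in a few lines: given a GMM already fit offline (mixture weights, means, and covariances), the MT evaluates the posterior responsibility of each component for its observed signal and feeds back the index of the most responsible component. This is a minimal illustrative sketch, not the paper's implementation; all function names and the toy parameters below are hypothetical, and the real scheme operates on received pilot observations rather than generic vectors.

```python
import numpy as np

def log_gauss(x, mean, cov):
    """Log-density of a multivariate Gaussian N(x; mean, cov)."""
    d = x.size
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet + quad)

def feedback_index(obs, weights, means, covs):
    """Return the index of the GMM component with the highest
    responsibility for the observation: argmax_k pi_k * N(obs; mu_k, C_k).
    This index is the MT's feedback; the BS maps it to a codebook entry."""
    log_resp = np.array([
        np.log(w) + log_gauss(obs, m, c)
        for w, m, c in zip(weights, means, covs)
    ])
    return int(np.argmax(log_resp))

# Toy example (hypothetical 2-component GMM in R^2):
weights = np.array([0.5, 0.5])
means = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
covs = [np.eye(2), np.eye(2)]
obs = np.array([4.8, 5.1])  # observation close to the second component
idx = feedback_index(obs, weights, means, covs)
```

Because only an index is fed back, the feedback overhead is log2(K) bits for a K-component GMM, and the MT never has to estimate the channel explicitly.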

Citations (3)
