Attention-Infused Autoencoder for Massive MIMO CSI Compression (2504.12440v1)

Published 16 Apr 2025 in eess.SP

Abstract: As the number of multiple-input multiple-output (MIMO) antennas increases drastically with the development towards 6G systems, channel state information (CSI) compression becomes crucial for mitigating feedback overhead. In recent years, learning models such as autoencoders (AE) have been studied for CSI compression, aiming to eliminate model assumptions and reduce compression loss. However, current learning methods are often designed and trained for individual channel scenarios, with limited generalizability across scenarios whose channel characteristics differ markedly. Motivated by this, we propose a novel AE-based learning method named attention-infused autoencoder network (AiANet), which can adaptively extract channel-wise and spatial features of CSI in parallel with an attention fusion mechanism. In addition, a locally-aware self-attention mechanism is developed to extract both global and local spatial patterns, to better capture the unique CSI features of different scenarios. Moreover, a mixed-training scheme is introduced to enable the proposed AiANet to generalize across indoor and outdoor scenarios. Results show that when trained and tested in the same scenario, AiANet can substantially outperform existing AE-based methods such as ACRNet, with an improvement of up to 3.42 dB in terms of normalized mean squared error (NMSE). With the mixed-training scheme, AiANet exhibits superior cross-scenario generalizability compared to benchmark methods that are trained in one scenario and applied in another.
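The core idea of fusing channel-wise and spatial attention in parallel, as described in the abstract, can be sketched as below. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the pooling choices, the sigmoid gating, and the fixed fusion weight `alpha` (which would be learnable in AiANet) are all assumptions for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_fusion(x, alpha=0.5):
    """Hypothetical sketch of parallel channel-wise / spatial attention fusion.

    x     : CSI feature map of shape (channels, height, width).
    alpha : fusion weight balancing the two branches; a learnable
            parameter in the paper, fixed here for illustration.
    """
    # Channel-wise branch: squeeze spatial dims, gate each channel.
    ch_gate = sigmoid(x.mean(axis=(1, 2)))       # shape (C,)
    ch_out = x * ch_gate[:, None, None]
    # Spatial branch: squeeze channel dim, gate each spatial location.
    sp_gate = sigmoid(x.mean(axis=0))            # shape (H, W)
    sp_out = x * sp_gate[None, :, :]
    # Adaptive fusion of the two parallel branches.
    return alpha * ch_out + (1.0 - alpha) * sp_out

# Example: a toy 2-channel 4x4 CSI feature map keeps its shape after fusion.
feat = np.random.randn(2, 4, 4)
fused = attention_fusion(feat)
```

A mixed-training scheme in this spirit would simply draw each training batch from a pool of both indoor and outdoor CSI samples, so the encoder sees both channel distributions during optimization.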

Authors (2)
