
MIMO Channel as a Neural Function: Implicit Neural Representations for Extreme CSI Compression in Massive MIMO Systems (2403.13615v1)

Published 20 Mar 2024 in cs.IT, eess.SP, and math.IT

Abstract: Acquiring and utilizing accurate channel state information (CSI) can significantly improve transmission performance and thus plays a crucial role in realizing the potential advantages of massive multiple-input multiple-output (MIMO) technology. Current prevailing CSI feedback approaches improve precision by employing advanced deep-learning methods to learn representative CSI features for a subsequent compression process. Diverging from previous works, we treat the CSI compression problem in the context of implicit neural representations. Specifically, each CSI matrix is viewed as a neural function that maps the CSI coordinates (antenna number and subchannel) to the corresponding channel gains. Instead of transmitting the parameters of the implicit neural functions directly, we transmit modulations derived from the CSI matrix through a meta-learning algorithm. These modulations are applied to a shared base network to generate the elements of the CSI matrix. The modulations corresponding to each CSI matrix are quantized and entropy-coded to further reduce the communication bandwidth, thus achieving extreme CSI compression ratios. Numerical results show that our proposed approach achieves state-of-the-art performance and showcases flexibility in feedback strategies.
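
As a rough illustration of the "CSI matrix as a neural function" idea described in the abstract, the sketch below implements a SIREN-style shared base network whose hidden layers are conditioned on per-CSI-matrix shift modulations (in the spirit of COIN++ / FiLM). The layer sizes, the 32x64 antenna/subcarrier grid, and the zero-initialized modulation vectors are illustrative assumptions, not the paper's exact configuration; in the proposed scheme the modulations would instead be obtained by a meta-learned inner-loop fit to each CSI matrix and then quantized and entropy-coded before feedback.

```python
import torch
import torch.nn as nn

class ModulatedSiren(nn.Module):
    """Shared base network: maps CSI coordinates (antenna, subcarrier) to a
    (real, imag) channel gain, conditioned on per-matrix shift modulations."""

    def __init__(self, hidden_dim=128, num_layers=3, w0=30.0):
        super().__init__()
        self.w0 = w0
        dims = [2] + [hidden_dim] * num_layers
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(num_layers)
        )
        self.out = nn.Linear(hidden_dim, 2)  # real and imaginary parts

    def forward(self, coords, modulations):
        # coords: (N, 2) normalized (antenna, subcarrier) indices in [-1, 1]
        # modulations: one (hidden_dim,) shift vector per hidden layer
        x = coords
        for layer, shift in zip(self.layers, modulations):
            # FiLM-style additive shift followed by the sine nonlinearity
            x = torch.sin(self.w0 * (layer(x) + shift))
        return self.out(x)

# Example: reconstruct a 32-antenna x 64-subcarrier CSI matrix from modulations.
base = ModulatedSiren()
antennas = torch.linspace(-1.0, 1.0, 32)
subcarriers = torch.linspace(-1.0, 1.0, 64)
coords = torch.cartesian_prod(antennas, subcarriers)      # (32 * 64, 2)
# Placeholder modulations; in practice these come from meta-learned fitting.
mods = [torch.zeros(128) for _ in base.layers]
csi = base(coords, mods).view(32, 64, 2)                  # (antenna, subcarrier, re/im)
```

Under this setup, only the (quantized) modulation vectors would need to be fed back, since the base network is shared between the encoder and decoder sides.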
