
MoE-AMC: Enhancing Automatic Modulation Classification Performance Using Mixture-of-Experts (2312.02298v1)

Published 4 Dec 2023 in eess.SP, cs.CV, cs.LG, and stat.AP

Abstract: Automatic Modulation Classification (AMC) plays a vital role in time-series analysis tasks such as signal classification and identification within wireless communications. Deep learning-based AMC models have demonstrated significant potential in this domain. However, current AMC models inadequately account for the differences between handling signals at low and high Signal-to-Noise Ratio (SNR), resulting in uneven performance across SNR levels. In this study, we propose MoE-AMC, a novel Mixture-of-Experts (MoE) based model crafted to address AMC in a balanced manner across varying SNR conditions. Within the MoE framework, MoE-AMC combines the strengths of LSRM (a Transformer-based model) for handling low-SNR signals and HSRM (a ResNet-based model) for high-SNR signals. This integration enables MoE-AMC to achieve leading performance in modulation classification, capturing distinctive signal features under diverse SNR scenarios. In experiments on the RML2018.01a dataset, MoE-AMC achieved an average classification accuracy of 71.76% across SNR levels, surpassing previous SOTA models by nearly 10%. This study represents a pioneering application of MoE techniques to AMC, offering a promising avenue for improving signal classification accuracy in wireless communication systems.
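The core idea described in the abstract is a gated combination of two expert classifiers, one specialized for low-SNR signals and one for high-SNR signals. The sketch below illustrates that MoE combination in a minimal, hedged form: the linear gating network and the stand-in expert functions are assumptions of this sketch (the paper's actual LSRM is Transformer-based and its HSRM is ResNet-based), but the weighted mixing of expert outputs is the general MoE mechanism.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax along the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(x, experts, gate_w, gate_b):
    """Combine expert class probabilities with a soft gate.

    The linear gate here is illustrative; the paper's gating
    network architecture is not specified in this listing.
    """
    gate = softmax(x @ gate_w + gate_b)                      # (batch, n_experts)
    expert_out = np.stack([f(x) for f in experts], axis=1)   # (batch, n_experts, n_classes)
    return (gate[..., None] * expert_out).sum(axis=1)        # (batch, n_classes)

rng = np.random.default_rng(0)

# Hypothetical stand-ins for LSRM (low-SNR expert) and HSRM (high-SNR expert):
W_low, W_high = rng.normal(size=(8, 4)), rng.normal(size=(8, 4))
lsrm = lambda x: softmax(x @ W_low)
hsrm = lambda x: softmax(x @ W_high)

x = rng.normal(size=(2, 8))                   # toy batch of signal features
gate_w, gate_b = rng.normal(size=(8, 2)), np.zeros(2)
probs = moe_predict(x, [lsrm, hsrm], gate_w, gate_b)
```

Because the gate weights are a convex combination (softmax outputs summing to 1) and each expert emits a probability distribution over modulation classes, the mixed output is itself a valid distribution.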
