MoE-AMC: Enhancing Automatic Modulation Classification Performance Using Mixture-of-Experts (2312.02298v1)
Abstract: Automatic Modulation Classification (AMC) plays a vital role in time series analysis tasks such as signal classification and identification within wireless communications. Deep learning-based AMC models have demonstrated significant potential in this domain, but they rarely account for the differences between handling signals at low and high Signal-to-Noise Ratio (SNR), which results in uneven performance across SNR conditions. In this study, we propose MoE-AMC, a novel Mixture-of-Experts (MoE) based model crafted to address AMC in a balanced manner across varying SNR conditions. Within the MoE framework, MoE-AMC combines the strengths of LSRM (a Transformer-based model) for handling low-SNR signals and HSRM (a ResNet-based model) for high-SNR signals. This integration enables MoE-AMC to achieve leading performance in modulation classification and to capture distinctive signal features under diverse SNR scenarios. In experiments on the RML2018.01a dataset, MoE-AMC achieved an average classification accuracy of 71.76% across SNR levels, surpassing previous SOTA models by nearly 10%. This study represents a pioneering application of MoE techniques in the realm of AMC and offers a promising avenue for improving signal classification accuracy in wireless communication systems.
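The abstract does not give implementation details for LSRM, HSRM, or the gating mechanism, so the following is only a minimal sketch of the general idea: a learned gate weighting the outputs of a Transformer-style expert and a ResNet-style expert over I/Q frames of shape (2, 1024), as in RML2018.01a. The expert architectures, layer sizes, and the signal-based gate are illustrative assumptions, not the paper's actual design.

```python
# Hedged sketch of an MoE over two experts for modulation classification.
# Stand-ins: TransformerExpert ~ LSRM (low SNR), ResNetExpert ~ HSRM (high SNR).
import torch
import torch.nn as nn

NUM_CLASSES = 24   # RML2018.01a has 24 modulation classes
SIG_LEN = 1024     # samples per frame, 2 channels (I and Q)


class TransformerExpert(nn.Module):
    """Simplified stand-in for a Transformer-based low-SNR expert."""
    def __init__(self, d_model=64, patch=16):
        super().__init__()
        self.embed = nn.Conv1d(2, d_model, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, NUM_CLASSES)

    def forward(self, x):                       # x: (B, 2, SIG_LEN)
        tokens = self.embed(x).transpose(1, 2)  # (B, SIG_LEN/patch, d_model)
        return self.head(self.encoder(tokens).mean(dim=1))


class ResNetExpert(nn.Module):
    """Simplified stand-in for a ResNet-based high-SNR expert."""
    def __init__(self, ch=64):
        super().__init__()
        self.stem = nn.Conv1d(2, ch, kernel_size=7, padding=3)
        self.block = nn.Sequential(
            nn.Conv1d(ch, ch, 3, padding=1), nn.BatchNorm1d(ch), nn.ReLU(),
            nn.Conv1d(ch, ch, 3, padding=1), nn.BatchNorm1d(ch),
        )
        self.head = nn.Linear(ch, NUM_CLASSES)

    def forward(self, x):
        h = self.stem(x)
        h = torch.relu(h + self.block(h))   # residual connection
        return self.head(h.mean(dim=-1))    # global average pooling over time


class MoEAMC(nn.Module):
    """Softmax gate mixes the two experts' class logits per sample."""
    def __init__(self):
        super().__init__()
        self.experts = nn.ModuleList([TransformerExpert(), ResNetExpert()])
        self.gate = nn.Sequential(nn.Flatten(), nn.Linear(2 * SIG_LEN, 2))

    def forward(self, x):
        weights = torch.softmax(self.gate(x), dim=-1)           # (B, 2)
        logits = torch.stack([e(x) for e in self.experts], 1)   # (B, 2, C)
        return (weights.unsqueeze(-1) * logits).sum(dim=1)      # (B, C)


if __name__ == "__main__":
    model = MoEAMC()
    frames = torch.randn(8, 2, SIG_LEN)   # a batch of random I/Q frames
    print(model(frames).shape)            # torch.Size([8, 24])
```

In this sketch the gate is learned directly from the raw signal; the paper's approach of specializing experts by SNR regime could equally be realized by conditioning the gate on an SNR estimate, which is not shown here.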