Toward Scalable Generative AI via Mixture of Experts in Mobile Edge Networks (2402.06942v1)
Abstract: The advancement of generative artificial intelligence (GAI) has driven revolutionary applications such as ChatGPT. The widespread adoption of these applications relies on the mixture of experts (MoE) architecture, which contains multiple experts and selectively engages only a subset of them for each task, lowering operation costs while maintaining performance. Even with MoE, GAI faces challenges in resource consumption when deployed on user devices. This paper proposes MoE-based GAI supported by mobile edge networks. We first review MoE from both traditional AI and GAI perspectives, covering its structure, principles, and applications. We then propose a framework that offloads subtasks to devices in mobile edge networks, enabling GAI models to operate on user devices. We discuss the challenges in this process and introduce a deep reinforcement learning (DRL)-based algorithm to select edge devices for subtask execution. Experimental results show that our framework not only facilitates GAI deployment on resource-limited devices but also generates higher-quality content than methods without edge network support.
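The paper itself does not include code, but the selective-engagement idea the abstract describes can be illustrated with a minimal sparse top-k MoE layer. The sketch below is in PyTorch; the class name, expert architecture, and hyperparameters (4 experts, k=2) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal sparse MoE sketch: a gate scores all experts per input,
    but only the top-k experts are actually evaluated, which is how MoE
    lowers operation cost while retaining model capacity."""
    def __init__(self, dim, num_experts=4, k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.gate = nn.Linear(dim, num_experts)  # router producing expert scores
        self.k = k

    def forward(self, x):                        # x: (batch, dim)
        scores = self.gate(x)                    # (batch, num_experts)
        topk_vals, topk_idx = scores.topk(self.k, dim=-1)
        weights = F.softmax(topk_vals, dim=-1)   # renormalize over selected experts
        out = torch.zeros_like(x)
        for slot in range(self.k):               # dispatch each input to its chosen experts
            for e, expert in enumerate(self.experts):
                mask = topk_idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

# Usage: only 2 of the 4 experts run per input, so compute scales with k,
# not with the total number of experts.
layer = TopKMoE(dim=32)
y = layer(torch.randn(8, 32))
```

In the paper's setting, the same routing principle is what makes edge offloading natural: because each subtask activates only a few experts, those experts can be placed on different edge devices, which is the selection problem the DRL algorithm addresses.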
Authors: Jiacheng Wang, Hongyang Du, Dusit Niyato, Jiawen Kang, Zehui Xiong, Dong In Kim, Khaled B. Letaief