Distributed Swarm Learning for Edge Internet of Things
Abstract: The rapid growth of the Internet of Things (IoT) has led to the widespread deployment of smart IoT devices at the wireless edge for collaborative machine learning tasks, ushering in a new era of edge learning. With a huge number of hardware-constrained IoT devices operating in resource-limited wireless networks, edge learning encounters substantial challenges, including communication and computation bottlenecks, device and data heterogeneity, security risks, privacy leakage, non-convex optimization, and complex wireless environments. To address these issues, this article explores a novel framework known as distributed swarm learning (DSL), which combines artificial intelligence and biological swarm intelligence in a holistic manner. By harnessing advanced signal processing and communications, DSL provides efficient solutions and robust tools for large-scale IoT at the edge of wireless networks.
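To make the idea of combining gradient-based learning with biological swarm intelligence concrete, the following is a minimal sketch of what a DSL-style local update might look like, assuming a particle-swarm-inspired rule in which each worker blends a stochastic gradient step with attraction toward its personal-best and the swarm's global-best models. The function name, coefficients, and update form are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def dsl_local_update(w, v, grad, p_best, g_best,
                     lr=0.01, inertia=0.7, c1=1.5, c2=1.5, rng=None):
    """Hypothetical DSL-style step: an SGD move on the local loss plus a
    PSO velocity pulling the model toward the worker's personal best
    (p_best) and the swarm's global best (g_best)."""
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(), rng.random()
    # PSO velocity: inertia term + cognitive (personal best) + social (global best)
    v = inertia * v + c1 * r1 * (p_best - w) + c2 * r2 * (g_best - w)
    # Blend the swarm move with a gradient descent step on the local data
    w = w + v - lr * grad
    return w, v

# Toy usage: 5 workers, each holding a 3-parameter model
dim, n_workers = 3, 5
rng = np.random.default_rng(0)
models = rng.normal(size=(n_workers, dim))
velocities = np.zeros((n_workers, dim))
p_bests = models.copy()
g_best = models[0].copy()

for k in range(n_workers):
    fake_grad = rng.normal(size=dim)  # stand-in for a local mini-batch gradient
    models[k], velocities[k] = dsl_local_update(
        models[k], velocities[k], fake_grad, p_bests[k], g_best)
```

Under this assumed rule, the gradient term exploits local data while the personal-best and global-best attractions let workers share exploration across the swarm, which is the qualitative behavior the abstract attributes to DSL.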