OCD-FL: A Novel Communication-Efficient Peer Selection-based Decentralized Federated Learning

Published 6 Mar 2024 in cs.LG and cs.DC (arXiv:2403.04037v2)

Abstract: The conjunction of edge intelligence and the ever-growing Internet-of-Things (IoT) network heralds a new era of collaborative machine learning, with federated learning (FL) emerging as the most prominent paradigm. With the growing interest in these learning schemes, researchers have started addressing some of their most fundamental limitations. Indeed, conventional FL with a central aggregator presents a single point of failure and a network bottleneck. To bypass this issue, decentralized FL, where nodes collaborate in a peer-to-peer network, has been proposed. Despite the latter's efficiency, communication costs and data heterogeneity remain key challenges in decentralized FL. In this context, we propose a novel scheme, called opportunistic communication-efficient decentralized federated learning (OCD-FL), which systematically selects FL peers for collaboration so as to maximize FL knowledge gain while reducing energy consumption. Experimental results demonstrate that OCD-FL achieves performance similar to or better than fully collaborative FL, while reducing consumed energy by at least 30% and up to 80%.
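
The peer-selection idea at the heart of the abstract, each node trading expected knowledge gain against the energy cost of exchanging models, can be illustrated with a short sketch. The snippet below is not the paper's algorithm: it assumes, purely for illustration, that knowledge gain is proxied by the L2 distance between a node's model and a neighbor's, that per-peer transmission energy is known, and that selected models are merged by simple averaging; the names select_peers, aggregate, and the trade-off weight lam are hypothetical.

    # Hypothetical sketch of one opportunistic peer-selection round in
    # decentralized FL. Assumptions (not from the paper): knowledge gain is
    # proxied by L2 model distance, per-peer transmission energy is given,
    # and aggregation is a plain FedAvg-style mean.
    import numpy as np

    def knowledge_gain(own_model: np.ndarray, peer_model: np.ndarray) -> float:
        """Proxy for how much a peer's model could teach this node."""
        return float(np.linalg.norm(own_model - peer_model))

    def select_peers(own_model, peer_models, tx_energy, lam=1.0):
        """Keep peers whose gain-minus-energy score is positive.

        peer_models: dict peer_id -> flattened model parameters (np.ndarray)
        tx_energy:   dict peer_id -> energy cost of a model exchange (joules)
        lam:         weight trading knowledge gain against energy spent
        """
        scores = {
            pid: knowledge_gain(own_model, model) - lam * tx_energy[pid]
            for pid, model in peer_models.items()
        }
        return [pid for pid, score in scores.items() if score > 0.0]

    def aggregate(own_model, peer_models, selected):
        """Average the local model with the selected peers' models."""
        stack = [own_model] + [peer_models[pid] for pid in selected]
        return np.mean(stack, axis=0)

    # Toy round: a node with three in-range neighbors at different link costs.
    rng = np.random.default_rng(0)
    own = rng.normal(size=10)
    peers = {f"n{i}": rng.normal(size=10) for i in range(3)}
    energy = {"n0": 1.0, "n1": 8.0, "n2": 2.5}  # n1 is a costly, distant peer
    chosen = select_peers(own, peers, energy)   # the costly peer is filtered out
    own = aggregate(own, peers, chosen)

In a wireless deployment the energy term would typically come from a link-budget model (e.g., Friis-style path loss), but any per-peer cost estimate fits the same gain-versus-energy structure.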
