
LEAP: Optimization Hierarchical Federated Learning on Non-IID Data with Coalition Formation Game (2405.00579v1)

Published 1 May 2024 in cs.GT

Abstract: Although Hierarchical Federated Learning (HFL) uses edge servers (ESs) to alleviate communication burdens, its model performance is degraded by non-IID data and limited communication resources. Current works often assume that data is uniformly distributed, which contradicts the heterogeneity of IoT. Solutions that perform additional model training to check the data distribution inevitably increase computational costs and the risk of privacy leakage. The challenges in solving these issues are how to reduce the impact of non-IID data without involving raw data and how to rationalize communication resource allocation to address the straggler problem. To tackle these challenges, we propose a novel optimization method based on coaLition formation gamE and grAdient Projection, called LEAP. Specifically, we innovatively combine edge data distribution with a coalition formation game to dynamically adjust the correlations between clients and ESs, which ensures optimal correlations. We further capture client heterogeneity to achieve rational bandwidth allocation at the coalition level and determine the optimal transmission power within specified delay constraints at the client level. Experimental results on four real datasets show that LEAP achieves a 20.62% improvement in model accuracy compared to state-of-the-art baselines. Moreover, LEAP reduces transmission energy consumption by at least about 2.24 times.
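The abstract's core idea of pairing edge data distributions with a coalition formation game can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not LEAP's actual algorithm: the greedy switch rule, the uniform-distribution target, and all function names are assumptions. Each client repeatedly joins the edge server whose coalition's aggregate label distribution would become closest to uniform (measured by Jensen-Shannon divergence), which mimics how a hedonic coalition game could reduce per-edge non-IID skew without exchanging raw data.

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0  # 0 * log(0) is taken as 0
        return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def coalition_distribution(members, dists, sizes):
    """Sample-size-weighted label distribution of one edge server's coalition."""
    total = sum(sizes[c] for c in members)
    agg = sum(sizes[c] * np.asarray(dists[c], dtype=float) for c in members)
    return agg / total

def form_coalitions(dists, sizes, num_edges, max_rounds=50):
    """Greedy switch operations: each client joins the edge server whose
    coalition label distribution is closest to uniform after it joins."""
    n, labels = len(dists), len(dists[0])
    uniform = np.full(labels, 1.0 / labels)
    assign = [c % num_edges for c in range(n)]  # round-robin start
    for _ in range(max_rounds):
        changed = False
        for c in range(n):
            best_e, best_cost = assign[c], float("inf")
            for e in range(num_edges):
                members = [i for i in range(n) if assign[i] == e and i != c] + [c]
                cost = js_divergence(
                    coalition_distribution(members, dists, sizes), uniform)
                if cost < best_cost - 1e-12:  # switch only on strict improvement
                    best_cost, best_e = cost, e
            if best_e != assign[c]:
                assign[c], changed = best_e, True
        if not changed:  # no client wants to switch: a stable partition
            break
    return assign
```

Because every switch strictly decreases the mover's coalition divergence and the partition space is finite, this style of switch dynamics terminates at a stable partition, which is the usual convergence argument in hedonic coalition formation games.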
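The abstract also mentions choosing transmission power within a delay constraint. A standard way to formalize this, sketched below under assumed names and a textbook Shannon-capacity channel model (not necessarily the paper's exact formulation), is to invert the achievable rate R = B * log2(1 + p*g / (N0*B)) at the rate S/T required to upload an S-bit model within T seconds; the smallest feasible power also minimizes transmit energy for that deadline.

```python
import math

def min_power_for_delay(model_bits, delay_s, bandwidth_hz, gain, noise_psd):
    """Minimum transmit power (W) so that uploading `model_bits` over a
    Shannon-capacity link of `bandwidth_hz` finishes within `delay_s` seconds.

    Inverts R = B * log2(1 + p*g / (N0*B)) at the required rate S/T.
    """
    required_rate = model_bits / delay_s      # bits per second
    noise_power = noise_psd * bandwidth_hz    # total noise power, W
    required_snr = 2.0 ** (required_rate / bandwidth_hz) - 1.0
    return required_snr * noise_power / gain

def upload_delay(model_bits, power, bandwidth_hz, gain, noise_psd):
    """Upload delay (s) achieved at a given transmit power."""
    rate = bandwidth_hz * math.log2(
        1.0 + power * gain / (noise_psd * bandwidth_hz))
    return model_bits / rate
```

The exponential dependence of power on the required spectral efficiency S/(T*B) is why per-coalition bandwidth allocation matters for stragglers: a client starved of bandwidth needs exponentially more power (and energy) to meet the same deadline.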

