Timely Asynchronous Hierarchical Federated Learning: Age of Convergence (2306.12400v1)

Published 21 Jun 2023 in cs.IT, cs.DC, cs.LG, cs.NI, eess.SP, and math.IT

Abstract: We consider an asynchronous hierarchical federated learning (AHFL) setting with a client-edge-cloud framework. The clients exchange their trained parameters with their corresponding edge servers, which update the locally aggregated model. This model is then transmitted to all the clients in the local cluster. The edge servers communicate with the central cloud server for global model aggregation. The goal of each client is to converge to the global model while maintaining timeliness, i.e., keeping the training iteration time near optimal. We investigate the convergence criteria for such a system with dense clusters. Our analysis shows that for a system of $n$ clients with fixed average timeliness, convergence in finite time is probabilistically guaranteed if the nodes are divided into $O(1)$ clusters, that is, if the system is built as a sparse set of edge servers, each with a dense client base.
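To make the client-edge-cloud structure concrete, below is a minimal Python sketch of the two-level aggregation the abstract describes. It is a synchronous toy with placeholder local updates; `local_train`, `average`, `ahfl_round`, and all parameter names are illustrative assumptions, and the asynchrony, staleness, and timeliness mechanics that the paper's convergence analysis actually hinges on are not modeled here.

```python
import random

def local_train(model, lr=0.1):
    """Hypothetical local update: one noisy gradient step standing in for
    whatever training a client runs on its private data."""
    grad = [random.gauss(0.0, 1.0) for _ in model]  # placeholder gradient
    return [w - lr * g for w, g in zip(model, grad)]

def average(models):
    """Uniform parameter averaging over a list of models."""
    n = len(models)
    return [sum(ws) / n for ws in zip(*models)]

def ahfl_round(global_model, clusters, edge_rounds=2, local_steps=1):
    """One cloud round of client-edge-cloud aggregation.

    clusters: list of cluster sizes, one edge server per cluster.
    Within each cluster, the edge server broadcasts its model, clients
    train locally, and the edge server re-aggregates, for `edge_rounds`
    iterations; the cloud then averages the edge models globally.
    """
    edge_models = []
    for num_clients in clusters:
        edge_model = global_model
        for _ in range(edge_rounds):
            # Edge server broadcasts its current model to its cluster.
            client_models = [edge_model] * num_clients
            for _ in range(local_steps):
                client_models = [local_train(m) for m in client_models]
            edge_model = average(client_models)  # edge aggregation
        edge_models.append(edge_model)
    return average(edge_models)  # cloud (global) aggregation

# "Sparse set of edge servers with dense client bases": the number of
# clusters stays O(1) while each cluster holds a constant fraction of n.
n, num_clusters = 100, 2
model = [0.0] * 5
for _ in range(3):
    model = ahfl_round(model, clusters=[n // num_clusters] * num_clusters)
print(model)
```

The `clusters` argument is where the paper's regime of interest appears: holding the number of clusters at $O(1)$ as $n$ grows yields few edge servers, each serving a dense client base, which is the configuration under which the abstract states finite-time convergence is probabilistically guaranteed.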
