Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks (2108.09103v1)

Published 20 Aug 2021 in cs.LG, cs.NI, cs.SY, and eess.SY

Abstract: Implementing federated learning (FL) algorithms in wireless networks has attracted wide attention. However, few works have considered the impact of user mobility on learning performance. To fill this research gap, we first develop a theoretical model that characterizes the hierarchical federated learning (HFL) algorithm in wireless networks where mobile users may roam across multiple edge access points, leading to incomplete and inconsistent FL training. Second, we provide a convergence analysis of HFL with user mobility. Our analysis proves that the learning performance of HFL deteriorates drastically with highly mobile users, and that this decline is exacerbated by a small number of participants and large divergences among the users' local data distributions. To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm that redesigns the access mechanism, the local update rule, and the model aggregation scheme. Finally, we provide experiments to evaluate the learning performance of HFL and of our MACFL. The results show that MACFL enhances learning performance, especially in three cases: users with non-independent and identically distributed (non-IID) data, users with high mobility, and settings with a small number of users.
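To make the two-level aggregation concrete, below is a minimal Python sketch of one HFL round with a mobility-aware weighting in the spirit of MACFL: users run local updates, each edge access point aggregates its users (discounting a user whose partially trained model arrived mid-round due to roaming), and the cloud averages the edge models. All names, the 0.5 staleness discount, and the toy quadratic losses are illustrative assumptions, not the paper's exact update rules.

```python
# Sketch of one hierarchical FL round with a mobility-aware weight.
# Models are plain NumPy vectors; the discount factor is an assumption.
import numpy as np

def local_update(w, grad_fn, lr=0.1, steps=5):
    """Run a few local gradient steps starting from the edge model w."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * grad_fn(w)
    return w

def edge_aggregate(updates, n_samples, arrived_mid_round):
    """Weighted average at one edge access point. A user who roamed in
    mid-round contributes an inconsistent model, so its weight is cut."""
    weights = np.array([
        n * (0.5 if moved else 1.0)   # assumed mobility discount
        for n, moved in zip(n_samples, arrived_mid_round)
    ], dtype=float)
    weights /= weights.sum()
    return sum(w * u for w, u in zip(weights, updates))

def cloud_aggregate(edge_models, edge_populations):
    """Population-weighted average of edge models at the cloud server."""
    weights = np.array(edge_populations, dtype=float)
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, edge_models))

# Toy usage: per-user quadratic losses, so grad_fn(w) = w - target.
rng = np.random.default_rng(0)
dim, n_edges, users_per_edge = 4, 3, 5
w_cloud = np.zeros(dim)
edge_models, edge_pops = [], []
for _ in range(n_edges):
    targets = rng.normal(size=(users_per_edge, dim))   # non-IID optima
    updates = [local_update(w_cloud, lambda w, t=t: w - t) for t in targets]
    moved = rng.random(users_per_edge) < 0.3           # 30% roamed in
    edge_models.append(edge_aggregate(updates, [100] * users_per_edge, moved))
    edge_pops.append(users_per_edge)
w_cloud = cloud_aggregate(edge_models, edge_pops)
print("cloud model after one round:", np.round(w_cloud, 3))
```

In this sketch the mobility discount simply down-weights roaming users; the paper's MACFL additionally redesigns the access mechanism and local update rule, which are not modeled here.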

Authors (6)
  1. Chenyuan Feng (13 papers)
  2. Howard H. Yang (65 papers)
  3. Deshun Hu (1 paper)
  4. Zhiwei Zhao (13 papers)
  5. Tony Q. S. Quek (237 papers)
  6. Geyong Min (35 papers)
Citations (64)
