
Local-Global Knowledge Distillation in Heterogeneous Federated Learning with Non-IID Data (2107.00051v2)

Published 30 Jun 2021 in cs.LG, cs.AI, and cs.DC

Abstract: Federated learning enables multiple clients to collaboratively learn a global model by periodically aggregating the clients' models without transferring the local data. However, due to system and data heterogeneity, many approaches suffer from the "client-drift" issue, which can significantly slow the convergence of global model training. As clients perform local updates on heterogeneous data through heterogeneous systems, their local models drift apart. To tackle this issue, one intuitive idea is to guide local model training with global teachers, i.e., past global models, where each client learns the global knowledge from past global models via adaptive knowledge distillation techniques. Building on these insights, we propose a novel approach for heterogeneous federated learning, namely FedGKD, which fuses the knowledge from historical global models into local training to alleviate the "client-drift" issue. In this paper, we evaluate FedGKD with extensive experiments on various CV/NLP datasets (CIFAR-10/100, Tiny-ImageNet, AG News, SST5) and different heterogeneity settings. The proposed method is guaranteed to converge under common assumptions, and achieves superior empirical accuracy in fewer communication rounds than five state-of-the-art methods.
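To make the core idea concrete, below is a minimal sketch of the distillation-from-historical-global-models pattern the abstract describes, written in PyTorch. This is an illustration under stated assumptions, not the authors' implementation: the function names (fuse_teacher_logits, local_update, server_round) and hyperparameters (kd_weight, temperature, buffer_size) are hypothetical, and the teacher fusion is shown as a simple logit average of the buffered past global models.

```python
# Hypothetical sketch of FedGKD-style training (illustrative, not the authors' code).
import copy
import torch
import torch.nn.functional as F

def fuse_teacher_logits(history_buffer, x):
    """Average the logits of buffered past global models (the fused 'global teacher').

    Assumes the buffered models live on the same device as the batch x.
    """
    with torch.no_grad():
        logits = [teacher(x) for teacher in history_buffer]
    return torch.stack(logits).mean(dim=0)

def local_update(model, history_buffer, loader, kd_weight=0.2, temperature=2.0,
                 lr=0.01, epochs=1, device="cpu"):
    """One client's local training: task loss plus distillation from past global models."""
    model.to(device).train()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            student_logits = model(x)
            teacher_logits = fuse_teacher_logits(history_buffer, x)
            # Standard KD term: KL divergence between temperature-softened
            # student and teacher distributions, scaled by T^2.
            kd_loss = F.kl_div(
                F.log_softmax(student_logits / temperature, dim=1),
                F.softmax(teacher_logits / temperature, dim=1),
                reduction="batchmean",
            ) * temperature ** 2
            loss = F.cross_entropy(student_logits, y) + kd_weight * kd_loss
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model.state_dict()

def server_round(global_model, client_states, history_buffer, buffer_size=5):
    """FedAvg-style aggregation, then record the new global model for future distillation."""
    avg_state = {
        k: torch.stack([s[k].float() for s in client_states]).mean(dim=0)
        for k in global_model.state_dict()
    }
    global_model.load_state_dict(avg_state)
    # Keep only the most recent global models as teachers for the next round.
    history_buffer.append(copy.deepcopy(global_model).eval())
    if len(history_buffer) > buffer_size:
        history_buffer.pop(0)
    return global_model
```

In this sketch, the distillation term pulls each client's local model toward the behavior of recent global models during local epochs, which is the mechanism the paper credits with reducing client drift; the buffer size and the weighting between task loss and KD loss would be tuned per setting.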

Authors (8)
  1. Dezhong Yao (36 papers)
  2. Wanning Pan (2 papers)
  3. Yutong Dai (21 papers)
  4. Yao Wan (70 papers)
  5. Xiaofeng Ding (3 papers)
  6. Hai Jin (83 papers)
  7. Zheng Xu (73 papers)
  8. Lichao Sun (186 papers)
Citations (43)
