FedImpro: Measuring and Improving Client Update in Federated Learning (2402.07011v2)

Published 10 Feb 2024 in cs.LG, cs.AI, and cs.DC

Abstract: Federated Learning (FL) models often experience client drift caused by heterogeneous data, where the distribution of data differs across clients. To address this issue, advanced research primarily focuses on manipulating the existing gradients to achieve more consistent client models. In this paper, we present an alternative perspective on client drift and aim to mitigate it by generating improved local models. First, we analyze the generalization contribution of local training and conclude that this generalization contribution is bounded by the conditional Wasserstein distance between the data distributions of different clients. Then, we propose FedImpro to construct similar conditional distributions for local training. Specifically, FedImpro decouples the model into high-level and low-level components, and trains the high-level portion on reconstructed feature distributions. This approach enhances the generalization contribution and reduces the dissimilarity of gradients in FL. Experimental results show that FedImpro can help FL defend against data heterogeneity and enhance the generalization performance of the model.
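The mechanism in the abstract lends itself to a short sketch. Informally, the bound says the benefit of one client's local training is controlled by a conditional Wasserstein distance between client data distributions, roughly a label-weighted average $\sum_y p(y)\, W\!\big(\mathcal{D}_i(h \mid y), \mathcal{D}_j(h \mid y)\big)$ over per-class feature distributions (this paraphrase is ours, not the paper's exact statement). The PyTorch sketch below illustrates the decoupling idea under stated assumptions: the `SplitModel` architecture, the class-conditional Gaussian feature estimate (`feat_mu`, `feat_std`), and the equal weighting of the two losses are illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch of the FedImpro idea: decouple the model into a
# low-level feature extractor and a high-level classifier, then train the
# high-level part on BOTH real features and features sampled from a shared,
# reconstructed conditional distribution. A class-conditional Gaussian is
# assumed here for illustration; the paper's estimator may differ.

class SplitModel(nn.Module):
    def __init__(self, num_classes: int = 10, feat_dim: int = 128):
        super().__init__()
        self.low = nn.Sequential(                      # low-level component
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        self.high = nn.Linear(feat_dim, num_classes)   # high-level component

def local_step(model, x, y, feat_mu, feat_std, opt):
    """One client update: loss on real data plus loss on sampled features.

    feat_mu, feat_std: (num_classes, feat_dim) tensors describing the shared,
    reconstructed per-class feature distribution (assumed Gaussian here).
    """
    opt.zero_grad()
    feats = model.low(x)                               # real low-level features
    loss_real = F.cross_entropy(model.high(feats), y)

    # Draw pseudo-labels, then pseudo-features f ~ N(mu[y'], std[y']^2):
    # every client samples from the SAME conditional distribution.
    y_fake = torch.randint(0, feat_mu.size(0), (x.size(0),))
    f_fake = feat_mu[y_fake] + feat_std[y_fake] * torch.randn_like(feat_mu[y_fake])
    loss_fake = F.cross_entropy(model.high(f_fake), y_fake)

    (loss_real + loss_fake).backward()
    opt.step()
```

Because every client trains its high-level component on features drawn from the same shared conditional distribution, the high-level gradients become more similar across clients, which is the gradient-dissimilarity reduction the abstract describes.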

Authors (7)
  1. Zhenheng Tang (38 papers)
  2. Yonggang Zhang (36 papers)
  3. Shaohuai Shi (47 papers)
  4. Xinmei Tian (50 papers)
  5. Tongliang Liu (251 papers)
  6. Bo Han (282 papers)
  7. Xiaowen Chu (108 papers)
Citations (7)