FedCor: Correlation-Based Active Client Selection Strategy for Heterogeneous Federated Learning (2103.13822v3)

Published 24 Mar 2021 in cs.LG and cs.DC

Abstract: Client-wise data heterogeneity is one of the major issues that hinder effective training in federated learning (FL). Since the data distribution on each client may vary dramatically, the client selection strategy can significantly influence the convergence rate of the FL process. Active client selection strategies have been widely proposed in recent studies. However, they neglect the loss correlations between the clients and achieve only marginal improvement over the uniform selection strategy. In this work, we propose FedCor -- an FL framework built on a correlation-based client selection strategy, to boost the convergence rate of FL. Specifically, we first model the loss correlations between the clients with a Gaussian Process (GP). Based on the GP model, we derive a client selection strategy that significantly reduces the expected global loss in each round. Besides, we develop an efficient GP training method with low communication overhead in the FL scenario by utilizing the covariance stationarity. Our experimental results show that compared to the state-of-the-art method, FedCor can improve the convergence rates by $34\%\sim 99\%$ and $26\%\sim 51\%$ on FMNIST and CIFAR-10, respectively.
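The abstract's core idea -- use a GP over client loss correlations to pick the clients whose participation most reduces expected global loss -- can be illustrated with a minimal greedy sketch. This is an assumption-laden illustration, not the paper's actual algorithm: the covariance matrix `cov`, the additive `noise` term, and the variance-reduction score are all simplifications introduced here for clarity.

```python
import numpy as np

def select_clients(cov, num_select, noise=1e-3):
    """Greedy correlation-based client selection (illustrative sketch).

    cov: K x K covariance matrix of client losses (assumed given;
         FedCor estimates such correlations with a GP).
    Greedily adds the client whose conditioning most reduces the
    GP posterior variance summed over all clients.
    """
    K = cov.shape[0]
    selected, remaining = [], list(range(K))
    for _ in range(num_select):
        best, best_score = None, -np.inf
        for c in remaining:
            idx = selected + [c]
            # Covariance among the candidate selection (jittered for stability).
            sub = cov[np.ix_(idx, idx)] + noise * np.eye(len(idx))
            # Cross-covariance between all clients and the candidate selection.
            cross = cov[:, idx]
            # Total posterior variance reduction if these clients are observed.
            reduction = np.trace(cross @ np.linalg.solve(sub, cross.T))
            if reduction > best_score:
                best, best_score = c, reduction
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, with three clients where the first two are strongly correlated, selecting two clients picks one from the correlated pair plus the independent one, since observing a second correlated client adds little information.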

Authors (7)
  1. Minxue Tang (10 papers)
  2. Xuefei Ning (52 papers)
  3. Yitu Wang (7 papers)
  4. Jingwei Sun (31 papers)
  5. Yu Wang (939 papers)
  6. Hai Li (159 papers)
  7. Yiran Chen (176 papers)
Citations (68)
