Bandit-based Communication-Efficient Client Selection Strategies for Federated Learning (2012.08009v1)

Published 14 Dec 2020 in cs.LG and cs.AI

Abstract: Due to communication constraints and intermittent client availability in federated learning, only a subset of clients can participate in each training round. While most prior works assume uniform and unbiased client selection, recent work on biased client selection has shown that selecting clients with higher local losses can improve error convergence speed. However, previously proposed biased selection strategies either require additional communication cost for evaluating the exact local loss or utilize stale local loss, which can even make the model diverge. In this paper, we present a bandit-based communication-efficient client selection strategy UCB-CS that achieves faster convergence with lower communication overhead. We also demonstrate how client selection can be used to improve fairness.
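
The abstract casts client selection as a multi-armed bandit problem. As a rough illustration (not the paper's exact UCB-CS algorithm), a UCB-style rule might score each client by its last observed local loss plus an exploration bonus that favors rarely sampled clients. In the sketch below, the function name, the exploration coefficient `alpha`, and the use of the stale reported loss as the reward estimate are all assumptions; the actual UCB-CS index and its update rule are defined in the paper.

```python
import numpy as np

def select_clients_ucb(last_losses, pull_counts, t, m, alpha=2.0):
    """Hypothetical UCB-style client selection sketch (not the paper's
    exact UCB-CS rule): pick the m clients with the highest scores.

    last_losses -- most recently reported local loss per client (may be stale)
    pull_counts -- number of rounds each client has participated in so far
    t           -- current communication round (1-indexed)
    m           -- number of clients to select this round
    alpha       -- exploration coefficient (assumed value, not from the paper)
    """
    # Exploitation term: stale local loss. Exploration term: a standard
    # log(t) UCB bonus that grows for rarely selected clients.
    bonus = np.sqrt(alpha * np.log(max(t, 1)) / np.maximum(pull_counts, 1))
    # Never-selected clients get an infinite score so they are tried first.
    scores = np.where(pull_counts == 0, np.inf, last_losses + bonus)
    # Indices of the m highest-scoring clients.
    return np.argsort(scores)[-m:]

# Toy round loop: update counts and (stale) losses for selected clients.
K, m = 100, 10
rng = np.random.default_rng(0)
last_losses = np.zeros(K)            # placeholder until a loss is observed
pull_counts = np.zeros(K, dtype=int)
for t in range(1, 6):
    chosen = select_clients_ucb(last_losses, pull_counts, t, m)
    pull_counts[chosen] += 1
    last_losses[chosen] = rng.random(m)  # stand-in for reported local losses
```

Because the scores reuse losses reported in earlier rounds, this kind of rule avoids the extra communication round that exact-loss selection would require; the exploration bonus is what keeps stale estimates from locking the selector onto a fixed subset of clients.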

Authors (4)
  1. Yae Jee Cho (15 papers)
  2. Samarth Gupta (12 papers)
  3. Gauri Joshi (73 papers)
  4. Osman Yağan (38 papers)
Citations (62)
