
Federated learning with class imbalance reduction (2011.11266v1)

Published 23 Nov 2020 in cs.LG, cs.AI, and cs.DC

Abstract: Federated learning (FL) is a promising technique that enables a large number of edge computing devices to collaboratively train a global learning model. Due to privacy concerns, the raw data on devices is not available to the centralized server. Constrained by spectrum limitations and computation capacity, only a subset of devices can be engaged to train and transmit their trained models to the centralized server for aggregation. Since the local data distribution varies among devices, a class imbalance problem arises under unfavorable client selection, resulting in a slow convergence rate of the global model. In this paper, an estimation scheme is designed to reveal the class distribution without awareness of the raw data. Based on this scheme, a device selection algorithm towards minimal class imbalance is proposed, which can improve the convergence performance of the global model. Simulation results demonstrate the effectiveness of the proposed algorithm.
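The abstract describes selecting a subset of devices so that their combined class distribution is as balanced as possible. The following is a minimal sketch of that idea, not the paper's actual algorithm: it assumes per-device class counts have already been estimated (the paper's estimation scheme is not reproduced here) and greedily picks devices that minimize the KL divergence of the aggregated distribution from uniform, one possible imbalance measure.

```python
import numpy as np

def select_devices(class_counts, num_select):
    """Greedily select devices toward minimal class imbalance.

    class_counts: (n_devices, n_classes) array of estimated per-class
        sample counts on each device. In the paper these would come
        from the proposed estimation scheme; here they are assumed given.
    num_select: number of devices to engage in this training round.
    Returns the list of selected device indices.
    """
    n_devices, n_classes = class_counts.shape
    uniform = np.full(n_classes, 1.0 / n_classes)
    selected = []
    total = np.zeros(n_classes)
    for _ in range(num_select):
        best, best_kl = None, np.inf
        for d in range(n_devices):
            if d in selected:
                continue
            # Hypothetical aggregate if device d joins the round.
            cand = total + class_counts[d]
            p = cand / cand.sum()
            # KL divergence from the uniform distribution as the
            # imbalance measure (a simple illustrative choice).
            kl = np.sum(p * np.log(np.maximum(p, 1e-12) / uniform))
            if kl < best_kl:
                best, best_kl = d, kl
        selected.append(best)
        total += class_counts[best]
    return selected

# Example: four devices, each holding samples of only one class.
# The greedy rule pairs devices with complementary classes.
counts = np.array([[10, 0], [0, 10], [10, 0], [0, 10]])
print(select_devices(counts, 2))
```

With the toy counts above, the first pick sees all devices as equally imbalanced and takes device 0; the second pick then prefers a class-1 device, yielding a perfectly balanced aggregate of [10, 10].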

Authors (5)
  1. Miao Yang (6 papers)
  2. Akitanoshou Wong (1 paper)
  3. Hongbin Zhu (5 papers)
  4. Haifeng Wang (194 papers)
  5. Hua Qian (6 papers)
Citations (107)
