Dubhe: Towards Data Unbiasedness with Homomorphic Encryption in Federated Learning Client Selection (2109.04253v1)

Published 8 Sep 2021 in cs.CR, cs.DC, and cs.LG

Abstract: Federated learning (FL) is a distributed machine learning paradigm that allows clients to collaboratively train a model over their own local data. FL promises client privacy, and its security can be strengthened by cryptographic methods such as additively homomorphic encryption (HE). However, FL's efficiency can suffer seriously from statistical heterogeneity, both in the data distribution discrepancy among clients and in the skewness of the global distribution. We mathematically demonstrate the cause of performance degradation in FL and examine the performance of FL over various datasets. To tackle the statistical heterogeneity problem, we propose a pluggable system-level client selection method named Dubhe, which allows clients to proactively participate in training while preserving their privacy with the assistance of HE. Experimental results show that Dubhe is comparable to the optimal greedy method in classification accuracy, with negligible encryption and communication overhead.
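The building block the abstract alludes to can be illustrated with a toy sketch: clients encrypt their per-class sample counts under an additively homomorphic scheme, and the server aggregates the ciphertexts without learning any individual client's distribution. This is a minimal, insecure demonstration (textbook Paillier with tiny hardcoded primes, and hypothetical client counts), not the paper's actual Dubhe protocol.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic).
# Hardcoded small primes for illustration only -- NOT secure.
P, Q = 499, 547
N = P * Q
N_SQ = N * N
G = N + 1                                        # standard simple choice of g
LAM = (P - 1) * (Q - 1) // math.gcd(P - 1, Q - 1)  # lcm(p-1, q-1)

def _L(x):
    return (x - 1) // N

MU = pow(_L(pow(G, LAM, N_SQ)), -1, N)  # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, N)
    while math.gcd(r, N) != 1:
        r = random.randrange(1, N)
    return (pow(G, m, N_SQ) * pow(r, N, N_SQ)) % N_SQ

def decrypt(c):
    return (_L(pow(c, LAM, N_SQ)) * MU) % N

def he_add(c1, c2):
    # Multiplying Paillier ciphertexts adds the underlying plaintexts.
    return (c1 * c2) % N_SQ

# Hypothetical per-class sample counts on three clients.
clients = {
    "A": [30, 5, 0],
    "B": [0, 25, 10],
    "C": [10, 0, 40],
}
enc = {cid: [encrypt(x) for x in counts] for cid, counts in clients.items()}

# The server aggregates ciphertexts class-by-class; only the key
# holder can decrypt the global class distribution.
agg = enc["A"]
for cid in ("B", "C"):
    agg = [he_add(a, b) for a, b in zip(agg, enc[cid])]

print([decrypt(c) for c in agg])  # → [40, 30, 50]
```

Given such an encrypted aggregate, a selection policy can favor client subsets whose combined distribution is closest to uniform, which is the flavor of unbiasedness the paper targets; the exact selection mechanism in Dubhe differs and is described in the paper.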

Authors (6)
  1. Shulai Zhang (6 papers)
  2. Zirui Li (43 papers)
  3. Quan Chen (91 papers)
  4. Wenli Zheng (6 papers)
  5. Jingwen Leng (50 papers)
  6. Minyi Guo (98 papers)
Citations (31)