FedML-HE: An Efficient Homomorphic-Encryption-Based Privacy-Preserving Federated Learning System (2303.10837v3)

Published 20 Mar 2023 in cs.LG and cs.CR

Abstract: Federated Learning trains machine learning models on distributed devices by aggregating local model updates instead of local data. However, privacy concerns arise because the local models aggregated on the server may reveal sensitive personal information through inversion attacks. Privacy-preserving methods, such as homomorphic encryption (HE), therefore become necessary for FL training. Despite HE's privacy advantages, applying it incurs impractical overheads, especially for foundation models. In this paper, we present FedML-HE, the first practical federated learning system with efficient HE-based secure model aggregation. FedML-HE proposes to selectively encrypt sensitive parameters, significantly reducing both computation and communication overheads during training while providing customizable privacy preservation. Our optimized system achieves considerable overhead reductions, particularly for large foundation models (e.g., ~10x for ResNet-50 and up to ~40x for BERT), demonstrating the potential for scalable HE-based FL deployment.
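The abstract's core mechanism, encrypting only a sensitive subset of model parameters before homomorphic aggregation, can be illustrated with a short sketch. The Python below is not the authors' implementation: it uses the open-source TenSEAL CKKS bindings as a stand-in HE backend, and it approximates the paper's sensitivity map with a simple update-magnitude heuristic (the actual system derives sensitivity from a privacy-leakage analysis). The names `sensitivity_mask`, `encrypt_ratio`, `client_package`, and `server_aggregate` are illustrative assumptions, not the paper's API.

```python
# Minimal sketch of selective-encryption aggregation in the spirit of FedML-HE.
# Assumptions (not from the paper): TenSEAL's CKKS scheme stands in for the
# HE backend, and "sensitivity" is approximated by mean update magnitude.
import numpy as np
import tenseal as ts  # pip install tenseal


def make_context():
    # CKKS context; by default it holds the secret key, so decryption works
    # for clients. A real deployment would send the server a public copy only.
    ctx = ts.context(ts.SCHEME_TYPE.CKKS,
                     poly_modulus_degree=8192,
                     coeff_mod_bit_sizes=[60, 40, 40, 60])
    ctx.global_scale = 2 ** 40
    return ctx


def sensitivity_mask(updates, encrypt_ratio=0.1):
    """Mark the coordinates with the largest mean |update| (hypothetical proxy)."""
    score = np.mean(np.abs(np.stack(updates)), axis=0)
    k = max(1, int(encrypt_ratio * score.size))
    mask = np.zeros(score.size, dtype=bool)
    mask[np.argsort(score)[-k:]] = True
    return mask


def client_package(ctx, update, mask):
    # Encrypt only the sensitive slice; the remainder travels in plaintext.
    return ts.ckks_vector(ctx, update[mask].tolist()), update[~mask]


def server_aggregate(enc_parts, plain_parts):
    # The server averages ciphertexts homomorphically, never decrypting them.
    n = len(enc_parts)
    enc_avg = enc_parts[0]
    for c in enc_parts[1:]:
        enc_avg = enc_avg + c            # ciphertext-ciphertext addition
    enc_avg = enc_avg * (1.0 / n)        # plaintext scalar multiplication
    plain_avg = np.mean(np.stack(plain_parts), axis=0)
    return enc_avg, plain_avg


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    updates = [rng.normal(size=1000) for _ in range(3)]  # 3 clients' updates
    mask = sensitivity_mask(updates, encrypt_ratio=0.1)

    ctx = make_context()
    packages = [client_package(ctx, u, mask) for u in updates]
    enc_avg, plain_avg = server_aggregate([p[0] for p in packages],
                                          [p[1] for p in packages])

    # A client holding the secret key reassembles the averaged update.
    merged = np.empty(mask.size)
    merged[mask] = np.array(enc_avg.decrypt())
    merged[~mask] = plain_avg
    assert np.allclose(merged, np.mean(np.stack(updates), axis=0), atol=1e-3)
```

Because only the masked slice (10% of the parameters in this sketch) incurs ciphertext expansion and HE arithmetic, computation and bandwidth scale with the encrypted fraction rather than the full model size, which is the intuition behind the ~10x to ~40x overhead reductions the abstract reports.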

Authors (8)
  1. Weizhao Jin (8 papers)
  2. Yuhang Yao (32 papers)
  3. Shanshan Han (18 papers)
  4. Carlee Joe-Wong (69 papers)
  5. Srivatsan Ravi (34 papers)
  6. Salman Avestimehr (116 papers)
  7. Chaoyang He (46 papers)
  8. Jiajun Gu (2 papers)
Citations (34)

