Federated Reconstruction: Partially Local Federated Learning (2102.03448v6)

Published 5 Feb 2021 in cs.LG and cs.DC

Abstract: Personalization methods in federated learning aim to balance the benefits of federated and local training for data availability, communication cost, and robustness to client heterogeneity. Approaches that require clients to communicate all model parameters can be undesirable due to privacy and communication constraints. Other approaches require always-available or stateful clients, impractical in large-scale cross-device settings. We introduce Federated Reconstruction, the first model-agnostic framework for partially local federated learning suitable for training and inference at scale. We motivate the framework via a connection to model-agnostic meta learning, empirically demonstrate its performance over existing approaches for collaborative filtering and next word prediction, and release an open-source library for evaluating approaches in this setting. We also describe the successful deployment of this approach at scale for federated collaborative filtering in a mobile keyboard application.
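The core idea described in the abstract — keeping some model parameters local to each client and reconstructing them from scratch on-device each round, while only the remaining global parameters are communicated — can be sketched for the collaborative-filtering case. The toy matrix-factorization setup, function names, and hyperparameters below are illustrative assumptions for a minimal sketch, not the paper's actual implementation (the authors' open-source library is the authoritative reference):

```python
import numpy as np

def reconstruct_local(global_items, ratings, item_ids, steps=20, lr=0.1, dim=4):
    """Client side: rebuild the local user embedding from scratch,
    holding the global item embeddings fixed. The local embedding is
    never sent to the server (hypothetical toy setup)."""
    u = np.zeros(dim)
    for _ in range(steps):
        preds = global_items[item_ids] @ u        # predicted ratings
        err = preds - ratings
        u -= lr * (global_items[item_ids].T @ err) / len(ratings)
    return u

def fedrecon_round(global_items, clients, lr=0.05):
    """Server side: each client reconstructs its local parameters, then
    computes an update to the global item embeddings only; the server
    averages those updates. Assumes each client's item_ids are unique."""
    updates = np.zeros_like(global_items)
    for ratings, item_ids in clients:
        u = reconstruct_local(global_items, ratings, item_ids)
        preds = global_items[item_ids] @ u
        err = preds - ratings
        grad = np.outer(err, u) / len(ratings)   # d(loss)/d(item rows)
        delta = np.zeros_like(global_items)
        delta[item_ids] -= lr * grad
        updates += delta
    return global_items + updates / len(clients)
```

Because the user embedding is reconstructed on demand, clients can remain stateless between rounds and new clients need no stored local state at inference time, which is what makes the approach workable in large cross-device deployments.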

Authors (6)
  1. Karan Singhal (26 papers)
  2. Hakim Sidahmed (6 papers)
  3. Zachary Garrett (12 papers)
  4. Shanshan Wu (19 papers)
  5. Keith Rush (17 papers)
  6. Sushant Prakash (15 papers)
Citations (127)
