Partially Personalized Federated Learning: Breaking the Curse of Data Heterogeneity (2305.18285v1)

Published 29 May 2023 in cs.LG, cs.AI, math.OC, and stat.ML

Abstract: We present a partially personalized formulation of Federated Learning (FL) that strikes a balance between the flexibility of personalization and the cooperativeness of global training. In our framework, we split the variables into global parameters, which are shared across all clients, and individual local parameters, which are kept private. We prove that under the right split of parameters, it is possible to find global parameters that allow each client to fit their data perfectly, and we refer to the obtained problem as overpersonalized. For instance, the shared global parameters can be used to learn good data representations, whereas the personalized layers are fine-tuned for a specific client. Moreover, we present a simple algorithm for the partially personalized formulation that offers significant benefits to all clients. In particular, it breaks the curse of data heterogeneity in several settings, such as training with local steps, asynchronous training, and Byzantine-robust training.
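To make the split concrete, here is a minimal toy sketch (not the paper's exact algorithm) of partially personalized FL: each client fits y = w·x + b_i, where the slope w is a shared global parameter and the intercept b_i is a private personal parameter. Because every client can fit its data exactly once the intercepts are personalized, this toy problem is "overpersonalized" in the abstract's sense. The model, data, learning rate, and round counts are all illustrative assumptions.

```python
# Hypothetical sketch of partially personalized FL.
# Global parameter: slope w (averaged by the server).
# Personal parameter: intercept b_i (never leaves client i).
import random

random.seed(0)

def client_update(w, b, data, lr=0.1, steps=20):
    """Local SGD on both the shared slope w and the private intercept b."""
    for _ in range(steps):
        x, y = random.choice(data)
        err = (w * x + b) - y
        w -= lr * err * x   # gradient w.r.t. the global parameter
        b -= lr * err       # gradient w.r.t. the personal parameter
    return w, b

# Heterogeneous clients: same slope, different intercepts ("the right split"),
# so each client can fit its data perfectly once b_i is personalized.
true_w = 2.0
clients_data = [
    [(x, true_w * x + shift) for x in (0.0, 0.5, 1.0, 1.5)]
    for shift in (-1.0, 0.0, 3.0)
]

w_global = 0.0
b_local = [0.0] * len(clients_data)

for _ in range(200):
    new_ws = []
    for i, data in enumerate(clients_data):
        w_i, b_local[i] = client_update(w_global, b_local[i], data)
        new_ws.append(w_i)
    # The server averages only the global part; intercepts stay private.
    w_global = sum(new_ws) / len(new_ws)

print(w_global)  # approaches the shared slope 2.0 despite heterogeneous data
```

Because the personalized intercepts absorb the between-client heterogeneity, the averaged slope converges to the common value even though the clients' raw data distributions differ, which is the intuition behind "breaking the curse of data heterogeneity" with local steps.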

Authors (4)
  1. Konstantin Mishchenko (37 papers)
  2. Rustem Islamov (16 papers)
  3. Eduard Gorbunov (65 papers)
  4. Samuel Horváth (93 papers)
Citations (7)