Personalized Federated Learning via Amortized Bayesian Meta-Learning (2307.02222v1)

Published 5 Jul 2023 in cs.LG

Abstract: Federated learning is a decentralized, privacy-preserving technique that enables multiple clients to collaborate with a server to learn a global model without exposing their private data. However, statistical heterogeneity among clients poses a challenge: the global model may struggle to perform well on each client's specific task. To address this issue, we introduce a new perspective on personalized federated learning through Amortized Bayesian Meta-Learning. Specifically, we propose a novel algorithm, FedABML, which employs hierarchical variational inference across clients. The global prior aims to capture representations of the common intrinsic structure shared by heterogeneous clients, which can then be transferred to their respective tasks and aid in generating accurate client-specific approximate posteriors through a few local updates. Our theoretical analysis provides an upper bound on the average generalization error and guarantees generalization performance on unseen data. Finally, empirical results demonstrate that FedABML outperforms several competitive baselines.
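
The abstract describes the core loop at a high level: the server maintains a global prior, and each client runs a few variational updates to obtain a client-specific approximate posterior regularized toward that prior, after which the server refreshes the prior. The sketch below is a minimal, hypothetical illustration of that idea using a mean-field Gaussian posterior over a linear model's weights in PyTorch; the function names, loss weighting, and simple parameter averaging are assumptions made for illustration, not the paper's actual implementation.

```python
# Illustrative sketch of a FedABML-style round (hypothetical, not the authors' code).
# Assumptions: diagonal-Gaussian prior/posteriors over a linear model's weights,
# a reparameterized ELBO with a KL(q_i || p) term toward the global prior, and
# plain averaging of client posterior parameters to refresh that prior.
import torch
import torch.nn.functional as F

def kl_diag_gaussians(mu_q, log_var_q, mu_p, log_var_p):
    """KL(q || p) between diagonal Gaussians, summed over dimensions."""
    return 0.5 * torch.sum(
        log_var_p - log_var_q
        + (log_var_q.exp() + (mu_q - mu_p) ** 2) / log_var_p.exp()
        - 1.0
    )

def local_update(prior, data, n_steps=5, lr=1e-2, kl_weight=1.0):
    """Few-step variational update producing a client-specific posterior."""
    x, y = data
    mu = prior["mu"].clone().requires_grad_(True)
    log_var = prior["log_var"].clone().requires_grad_(True)
    opt = torch.optim.SGD([mu, log_var], lr=lr)
    for _ in range(n_steps):
        opt.zero_grad()
        # Reparameterized sample of the weights from q_i(w) = N(mu, exp(log_var)).
        w = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
        nll = F.mse_loss(x @ w, y, reduction="sum")
        kl = kl_diag_gaussians(mu, log_var, prior["mu"], prior["log_var"])
        (nll + kl_weight * kl).backward()
        opt.step()
    return {"mu": mu.detach(), "log_var": log_var.detach()}

# Toy federated round: the server broadcasts the prior, each client adapts locally,
# and the server averages the returned posterior parameters into a new prior.
dim, n_clients = 8, 4
prior = {"mu": torch.zeros(dim), "log_var": torch.zeros(dim)}
clients = [(torch.randn(20, dim), torch.randn(20)) for _ in range(n_clients)]
posteriors = [local_update(prior, data) for data in clients]
prior = {k: torch.stack([q[k] for q in posteriors]).mean(0) for k in prior}
```

In this toy version the KL term plays the role the abstract attributes to the global prior: it keeps each client's few-step posterior anchored to shared structure while the likelihood term adapts it to local data.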

Authors (6)
  1. Shiyu Liu (32 papers)
  2. Shaogao Lv (11 papers)
  3. Dun Zeng (15 papers)
  4. Zenglin Xu (145 papers)
  5. Hui Wang (371 papers)
  6. Yue Yu (343 papers)
Citations (1)
