A Bayesian Federated Learning Framework with Online Laplace Approximation (2102.01936v3)

Published 3 Feb 2021 in cs.LG, cs.AI, and cs.DC

Abstract: Federated learning (FL) allows multiple clients to collaboratively learn a globally shared model through cycles of model aggregation and local model training, without the need to share data. Most existing FL methods train local models separately on different clients and then simply average their parameters to obtain a centralized model on the server side. However, these approaches generally suffer from large aggregation errors and severe local forgetting, which are particularly pronounced in heterogeneous data settings. To tackle these issues, in this paper, we propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side. On the server side, a multivariate Gaussian product mechanism is employed to construct and maximize a global posterior, largely reducing the aggregation errors induced by large discrepancies between local models. On the client side, a prior loss that uses the global posterior's probabilistic parameters delivered from the server is designed to guide the local training. Incorporating such learning constraints from other clients enables our method to mitigate local forgetting. Finally, we achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
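The abstract describes two mechanisms: server-side aggregation via a product of local Gaussian posteriors, and a client-side prior loss built from the resulting global posterior. As a rough illustration (only the abstract is available here, not the paper's code), the sketch below implements both under a diagonal-covariance Laplace approximation, where each client posterior is N(mu_k, diag(1/p_k)) with p_k a diagonal precision (e.g., a diagonal Fisher estimate). The product of such Gaussians has precision p_g = sum_k p_k and mean mu_g = (sum_k p_k * mu_k) / p_g, and that mean is exactly the maximizer of the product posterior. All function names and the toy data are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def product_of_gaussians(mus, precisions):
    """Server-side aggregation sketch: the product of diagonal Gaussian
    posteriors N(mu_k, diag(1/p_k)) is itself Gaussian, with precision
    equal to the sum of precisions and a precision-weighted mean."""
    global_prec = np.sum(precisions, axis=0)          # p_g = sum_k p_k
    global_mu = np.sum(precisions * mus, axis=0) / global_prec
    return global_mu, global_prec

def prior_loss(theta, global_mu, global_prec, lam=1.0):
    """Client-side sketch: quadratic prior penalty derived from the
    global posterior, added to the local task loss during training
    to constrain local updates and curb local forgetting."""
    return 0.5 * lam * np.sum(global_prec * (theta - global_mu) ** 2)

# Toy example: three clients, five parameters each.
rng = np.random.default_rng(0)
mus = rng.normal(size=(3, 5))                 # local posterior means
precisions = rng.uniform(0.5, 2.0, (3, 5))    # diagonal precisions (e.g., Fisher)
g_mu, g_prec = product_of_gaussians(mus, precisions)
print(prior_loss(rng.normal(size=5), g_mu, g_prec))
```

In a full FL round under these assumptions, each client would minimize its task loss plus prior_loss locally, send its updated (mu_k, p_k) to the server, and the server would recompute (g_mu, g_prec) for the next round; the diagonal approximation keeps both communication and aggregation linear in the number of parameters.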

Authors (7)
  1. Liangxi Liu (4 papers)
  2. Xi Jiang (53 papers)
  3. Feng Zheng (117 papers)
  4. Hong Chen (230 papers)
  5. Guo-Jun Qi (76 papers)
  6. Heng Huang (189 papers)
  7. Ling Shao (244 papers)
Citations (47)