Specialized federated learning using a mixture of experts (2010.02056v3)

Published 5 Oct 2020 in cs.LG

Abstract: In federated learning, clients share a global model that has been trained on decentralized local client data. Although federated learning shows significant promise as a key approach when data cannot be shared or centralized, current methods show limited privacy properties and have shortcomings when applied to common real-world scenarios, especially when client data is heterogeneous. In this paper, we propose an alternative method to learn a personalized model for each client in a federated setting, with greater generalization abilities than previous methods. To achieve this personalization we propose a federated learning framework using a mixture of experts to combine the specialist nature of a locally trained model with the generalist knowledge of a global model. We evaluate our method on a variety of datasets with different levels of data heterogeneity, and our results show that the mixture of experts model is better suited as a personalized model for devices in these settings, outperforming both fine-tuned global models and local specialists.
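The abstract gives the high-level recipe, a gating mechanism that blends a locally trained specialist with the shared global model, but no implementation details. The following is a minimal PyTorch sketch of one plausible reading of that architecture; the class, the gating-network design, and all variable names are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class PersonalizedMoE(nn.Module):
    """Hypothetical per-client mixture of a local specialist and the global model."""

    def __init__(self, local_expert: nn.Module, global_model: nn.Module, in_dim: int):
        super().__init__()
        self.local_expert = local_expert  # trained only on this client's data
        self.global_model = global_model  # produced by federated averaging
        # Gating network: maps each input to a mixing weight in (0, 1).
        self.gate = nn.Sequential(nn.Linear(in_dim, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = self.gate(x)  # per-example weight assigned to the local expert
        return g * self.local_expert(x) + (1.0 - g) * self.global_model(x)

# Illustrative personalization setup on one client (stand-in linear experts):
in_dim, out_dim = 32, 10
local_net = nn.Linear(in_dim, out_dim)
global_net = nn.Linear(in_dim, out_dim)
model = PersonalizedMoE(local_net, global_net, in_dim)
for p in model.global_model.parameters():
    p.requires_grad = False  # keep the shared global weights frozen locally
```

Under this reading, only the gate (and optionally the local expert) is trained on client data, so each client gains generalist knowledge from the global model without overwriting it.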

Citations (26)
