
Client-supervised Federated Learning: Towards One-model-for-all Personalization (2403.19499v1)

Published 28 Mar 2024 in cs.LG

Abstract: Personalized Federated Learning (PerFL) is a machine learning paradigm that delivers personalized models to diverse clients under federated learning settings. Most PerFL methods require an extra learning process on each client to adapt a globally shared model into a client-specific personalized model using the client's local data. However, this model adaptation remains an open challenge at deployment and test time. This work tackles the challenge by proposing a novel federated learning framework that learns only one robust global model, achieving performance competitive with personalized models on unseen/test clients in the FL system. Specifically, we design Client-Supervised Federated Learning (FedCS) to unravel client-specific bias in instances' latent representations, so that the global model learns both client-specific and client-agnostic knowledge. Experiments show that FedCS learns a robust global model that is well-suited to the changing data distributions of unseen/test clients. FedCS's global model can be deployed directly to test clients while achieving performance comparable to personalized FL methods that require model adaptation.
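The abstract does not spell out FedCS's training mechanics, but the core idea it states (one shared global model, with client-specific bias separated out of the latent representations) can be illustrated with a minimal FedAvg-style sketch. Everything below is an assumption for illustration: the function names (`client_update`, `fedavg`), the linear encoder, and the additive per-client bias are hypothetical, not the paper's actual algorithm.

```python
import numpy as np

# Hedged sketch: assume each client's latent representation decomposes into a
# shared (client-agnostic) linear map W and a client-specific additive bias b.
# Only W is communicated and averaged; b stays local, so the single global
# model is not contaminated by per-client distribution shift.

rng = np.random.default_rng(0)

def client_update(W, X, Y, lr=0.1, steps=50):
    """Local least-squares update; b absorbs the client-specific shift."""
    b = np.zeros(W.shape[1])                 # client-specific bias (never sent)
    for _ in range(steps):
        resid = X @ W + b - Y                # prediction residual
        W = W - lr * (X.T @ resid) / len(X)  # update shared encoder
        b = b - lr * resid.mean(axis=0)      # update local bias
    return W, b

def fedavg(client_weights):
    """Server step: average client encoders into one global model."""
    return np.mean(client_weights, axis=0)

# Three synthetic clients sharing one true map but with biased labels.
d, k = 4, 2
W_true = rng.normal(size=(d, k))
clients = []
for shift in (-1.0, 0.0, 1.0):
    X = rng.normal(size=(64, d))
    clients.append((X, X @ W_true + shift))  # per-client additive bias

W_global = np.zeros((d, k))
for _ in range(20):                          # communication rounds
    local_Ws = [client_update(W_global, X, Y)[0] for X, Y in clients]
    W_global = fedavg(local_Ws)
```

Because the local bias term soaks up each client's shift, the averaged encoder converges toward the shared structure and can be deployed as-is to a new client, which is the deployment property the abstract claims for FedCS.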

Authors (2)
  1. Peng Yan
  2. Guodong Long