Client-supervised Federated Learning: Towards One-model-for-all Personalization (2403.19499v1)
Abstract: Personalized Federated Learning (PerFL) is a machine learning paradigm that delivers personalized models to diverse clients under federated learning settings. Most PerFL methods require an extra learning process on each client to adapt the globally shared model into a client-specific personalized model using the client's own local data. However, this model adaptation remains an open challenge at deployment and test time. This work tackles the challenge with a novel federated learning framework that learns only one robust global model, achieving performance competitive with personalized models on unseen/test clients in the FL system. Specifically, we design Client-Supervised Federated Learning (FedCS) to unravel clients' biases on instances' latent representations so that the global model learns both client-specific and client-agnostic knowledge. Experiments show that FedCS learns a robust global model that handles the changing data distributions of unseen/test clients. The FedCS global model can be deployed directly to test clients while achieving performance comparable to personalized FL methods that require model adaptation.
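The abstract states that FedCS "unravels clients' biases on instances' latent representations." A minimal sketch of one reading of that idea, purely illustrative and not the paper's actual method: treat each client's mean latent vector as its client-specific bias and remove it, leaving a client-agnostic residual that a single global model can consume. The function name and decomposition below are assumptions for illustration only.

```python
import numpy as np

def split_representation(z_client):
    """Illustrative (assumed) decomposition of a client's latent vectors
    into a client-specific bias and client-agnostic residuals.
    NOTE: this mean-offset split is a toy stand-in, not FedCS itself."""
    z_client = np.asarray(z_client, dtype=float)
    client_bias = z_client.mean(axis=0)   # client-specific component
    z_agnostic = z_client - client_bias   # client-agnostic residual
    return client_bias, z_agnostic

# Two clients whose representations differ only by a constant offset:
z_a = np.array([[1.0, 2.0], [3.0, 4.0]])
z_b = z_a + 10.0                          # simulated client-level bias
bias_a, res_a = split_representation(z_a)
bias_b, res_b = split_representation(z_b)
# After removing each client's bias the residuals coincide, so one
# global model could in principle serve both clients without adaptation.
assert np.allclose(res_a, res_b)
```

This toy example only shows why separating client-specific from client-agnostic signal makes a single global model plausible; the paper's actual mechanism is not described in the abstract.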
- Peng Yan
- Guodong Long