
Dynamic Heterogeneous Federated Learning with Multi-Level Prototypes (2312.09881v1)

Published 15 Dec 2023 in cs.LG and cs.AI

Abstract: Federated learning shows promise as a privacy-preserving collaborative learning technique. Existing work on heterogeneous federated learning mainly focuses on skewed label distributions across clients. However, most approaches suffer from catastrophic forgetting and concept drift, especially when the global class distribution is extremely imbalanced and each client's data distribution evolves dynamically over time. In this paper, we study a new task, Dynamic Heterogeneous Federated Learning (DHFL), which addresses the practical scenario of heterogeneous data distributions across clients combined with dynamic tasks within each client. Accordingly, we propose a novel federated learning framework named Federated Multi-Level Prototypes (FedMLP) and design federated multi-level regularizations. To mitigate concept drift, we construct prototypes and semantic prototypes that provide rich generalization knowledge and ensure the continuity of the prototype spaces. To maintain model stability and consistency of convergence, three regularizations are introduced as training losses: prototype-based regularization, semantic prototype-based regularization, and federated inter-task regularization. Extensive experiments show that the proposed method achieves state-of-the-art performance across various settings.
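The abstract does not spell out how the prototypes or the prototype-based regularization are computed. A common construction in prototype-based federated learning, and one plausible reading of the abstract, is to average per-class feature embeddings into class prototypes and penalize the distance between local embeddings and the aggregated global prototypes. The sketch below follows that generic recipe; the function names and the squared-distance choice are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def class_prototypes(embeddings, labels):
    """Build one prototype per class as the mean embedding of that
    class (a standard prototype construction; assumed, not from the
    paper)."""
    return {int(c): embeddings[labels == c].mean(axis=0)
            for c in np.unique(labels)}

def prototype_regularization(embeddings, labels, global_protos):
    """Mean squared distance between each local embedding and the
    global prototype of its class; pulls local features toward the
    shared prototype space."""
    dists = [np.sum((e - global_protos[int(y)]) ** 2)
             for e, y in zip(embeddings, labels)]
    return float(np.mean(dists))

# Toy usage: four 2-D embeddings, two classes.
emb = np.array([[0., 0.], [2., 2.], [4., 4.], [6., 6.]])
labels = np.array([0, 0, 1, 1])
protos = class_prototypes(emb, labels)          # {0: [1,1], 1: [5,5]}
loss = prototype_regularization(emb, labels, protos)  # 2.0
```

In a federated round, each client would compute local prototypes, the server would aggregate them into global prototypes, and clients would add a term like `prototype_regularization` to their training loss to keep local feature spaces aligned.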

Authors (3)
  1. Shunxin Guo
  2. Hongsong Wang
  3. Xin Geng
