Federated Class-Incremental Learning with New-Class Augmented Self-Distillation (2401.00622v3)
Abstract: Federated Learning (FL) enables collaborative model training among participants while preserving the privacy of raw data. Mainstream FL methods, however, overlook the dynamic nature of real-world data, particularly its tendency to grow in volume and diversify in classes over time. As a result, they suffer from catastrophic forgetting: trained models inadvertently discard previously learned information when assimilating new data. To address this challenge, we propose a novel Federated Class-Incremental Learning (FCIL) method, named Federated Class-Incremental Learning with New-Class Augmented Self-Distillation (FedCLASS). The core of FedCLASS is to enrich the class scores of historical models with the new-class scores predicted by current models and to use the combined knowledge for self-distillation, enabling more complete and precise knowledge transfer from historical models to current models. Theoretical analyses show that FedCLASS rests on sound foundations: the old-class scores predicted by historical models are treated as conditional probabilities in the absence of new classes, and the new-class scores predicted by current models as the conditional probabilities of class scores derived from historical models. Empirical experiments demonstrate the superiority of FedCLASS over four baseline algorithms in reducing the average forgetting rate and boosting global accuracy.
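To make the mechanism concrete, below is a minimal PyTorch sketch of a new-class augmented self-distillation loss, following the description in the abstract. This is an illustrative reconstruction, not the authors' reference implementation: the function name `fedclass_distillation_loss`, the temperature `T`, and the exact rescaling scheme are assumptions. The historical model's old-class scores are treated as a conditional distribution given that the sample is not from a new class, while the current model's new-class scores supply the new-class probability mass.

```python
import torch
import torch.nn.functional as F

def fedclass_distillation_loss(cur_logits: torch.Tensor,
                               hist_logits: torch.Tensor,
                               num_old: int,
                               T: float = 2.0) -> torch.Tensor:
    """New-class augmented self-distillation loss (illustrative sketch).

    cur_logits:  [B, num_old + num_new] logits from the current model.
    hist_logits: [B, num_old] logits from the frozen historical model.
    """
    # Student distribution over all (old + new) classes.
    p_cur = F.softmax(cur_logits / T, dim=1)

    # Historical scores, interpreted as conditional probabilities over
    # old classes given that the sample does not belong to a new class.
    p_hist = F.softmax(hist_logits / T, dim=1)

    # Probability mass the current model assigns to the new classes.
    new_mass = p_cur[:, num_old:].sum(dim=1, keepdim=True)

    # Augmented teacher: old-class scores from the historical model,
    # rescaled by the non-new-class mass, concatenated with the
    # new-class scores from the current model (sums to 1 per sample).
    teacher = torch.cat([p_hist * (1.0 - new_mass),
                         p_cur[:, num_old:]], dim=1).detach()

    # Standard distillation objective: KL(teacher || student), scaled by T^2.
    log_student = F.log_softmax(cur_logits / T, dim=1)
    return F.kl_div(log_student, teacher, reduction="batchmean") * (T ** 2)
```

The teacher distribution is detached so gradients flow only through the student's log-probabilities, as in standard knowledge distillation; the `T ** 2` factor compensates for the softened gradients at temperature `T`.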