Towards Efficient Replay in Federated Incremental Learning (2403.05890v3)
Abstract: In Federated Learning (FL), the data on each client is typically assumed to be fixed. However, data often arrives incrementally in real-world applications, where the data domain may grow dynamically. In this work, we study catastrophic forgetting under data heterogeneity in Federated Incremental Learning (FIL) scenarios where edge clients may lack enough storage space to retain all past data. We propose Re-Fed, a simple, generic framework for FIL that coordinates each client to cache important samples for replay. More specifically, when a new task arrives, each client first caches selected previous samples based on their global and local importance. The client then trains the local model on both the cached samples and the samples from the new task. Theoretically, we analyze the ability of Re-Fed to discover important samples for replay and thus alleviate catastrophic forgetting. Moreover, we empirically show that Re-Fed achieves competitive performance compared to state-of-the-art methods.
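To make the cache-then-replay loop concrete, below is a minimal PyTorch sketch of one client's step when a new task arrives. All names here (`importance`, `select_replay_cache`, `train_on_new_task`, the weight `lam`) are illustrative assumptions, and the loss-based importance score is a stand-in: the paper derives its global/local importance differently, so this is a sketch of the general idea, not Re-Fed's actual scoring rule.

```python
"""Hypothetical sketch of replay caching in FIL; not the paper's implementation."""
import heapq
from typing import Callable, List, Tuple

import torch
import torch.nn.functional as F
from torch import nn

Sample = Tuple[torch.Tensor, int]  # (input, label)


def importance(model_global: nn.Module, model_local: nn.Module,
               x: torch.Tensor, y: int, lam: float = 0.5) -> float:
    """Score a previous sample by mixing global and local signals.

    Assumption: per-sample loss under each model serves as an importance
    proxy; the paper's actual score is computed differently.
    """
    with torch.no_grad():
        target = torch.tensor([y])
        g = F.cross_entropy(model_global(x.unsqueeze(0)), target).item()
        l = F.cross_entropy(model_local(x.unsqueeze(0)), target).item()
    return lam * g + (1.0 - lam) * l


def select_replay_cache(old_data: List[Sample], budget: int,
                        score: Callable[[torch.Tensor, int], float]) -> List[Sample]:
    """Keep only the `budget` highest-scoring previous samples,
    reflecting the limited storage on edge clients."""
    return heapq.nlargest(budget, old_data, key=lambda s: score(s[0], s[1]))


def train_on_new_task(model: nn.Module, cache: List[Sample],
                      new_data: List[Sample], epochs: int = 1,
                      lr: float = 0.01) -> None:
    """Local update on the union of cached (replay) and new-task samples."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in cache + new_data:
            opt.zero_grad()
            loss = F.cross_entropy(model(x.unsqueeze(0)), torch.tensor([y]))
            loss.backward()
            opt.step()
```

The mixing weight `lam` trades off the two importance signals the abstract names: a globally informed score (how relevant a sample is to the shared model) against a locally informed one (how relevant it is to this client's own data), with the cache budget enforcing the storage constraint.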