FeTrIL++: Feature Translation for Exemplar-Free Class-Incremental Learning with Hill-Climbing (2403.07406v1)
Abstract: Exemplar-free class-incremental learning (EFCIL) poses significant challenges, primarily due to catastrophic forgetting, and requires a delicate balance between stability and plasticity to accurately recognize both new and previous classes. Traditional EFCIL approaches typically favor either plasticity, through successive fine-tuning, or stability, by freezing the feature extractor after the initial incremental state. Building on the foundational FeTrIL framework, our research extends into novel experimental domains to examine the efficacy of various oversampling techniques and dynamic optimization strategies across multiple challenging datasets and incremental settings. We specifically explore how oversampling impacts accuracy relative to feature availability, and how different optimization methodologies, including dynamic recalibration and feature-pool diversification, influence incremental learning outcomes. The results of these comprehensive experiments, conducted on CIFAR-100, Tiny-ImageNet, and ImageNet-Subset, underscore the superior performance of FeTrIL in balancing accuracy on both new and past classes against ten contemporary methods. Notably, our extensions reveal the nuanced impacts of oversampling and optimization on EFCIL, contributing to a more refined understanding of feature-space manipulation for class-incremental learning. FeTrIL and its extended analysis in this paper, FeTrIL++, pave the way for more adaptable and efficient EFCIL methodologies, promising significant improvements in handling catastrophic forgetting without the need for exemplars.
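The core "feature translation" idea behind FeTrIL can be sketched briefly. In a simplified reading of the method (variable names here are illustrative, not taken from the authors' code), pseudo-features for a past class are obtained by shifting features of a new class, computed with a frozen extractor, so that their mean coincides with the past class's stored centroid, since no exemplars of past classes are kept:

```python
import numpy as np

def translate_features(new_feats: np.ndarray,
                       mu_new: np.ndarray,
                       mu_past: np.ndarray) -> np.ndarray:
    """FeTrIL-style translation sketch: shift new-class features so their
    mean matches a past-class centroid, yielding pseudo-features for that
    past class. Only centroids of past classes need to be stored."""
    return new_feats + (mu_past - mu_new)

rng = np.random.default_rng(0)
# Features of one new class, as produced by a frozen feature extractor
# (128 samples, 64-dimensional; sizes are arbitrary for the sketch).
new_feats = rng.normal(loc=1.0, scale=0.5, size=(128, 64))
mu_new = new_feats.mean(axis=0)      # centroid of the new class
mu_past = rng.normal(size=64)        # stored centroid of a past class

pseudo = translate_features(new_feats, mu_new, mu_past)
# By construction, the pseudo-features' mean equals the stored past centroid.
assert np.allclose(pseudo.mean(axis=0), mu_past)
```

The oversampling questions studied in FeTrIL++ then amount to how many such translated samples to generate per past class, and from which new-class feature pools, before training the linear classifier on the mix of real new-class features and pseudo-features.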