FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning (2211.13131v2)
Abstract: Exemplar-free class-incremental learning is very challenging due to the negative effect of catastrophic forgetting. A balance between the stability and the plasticity of the incremental process is needed in order to obtain good accuracy for past as well as new classes. Existing exemplar-free class-incremental methods focus either on successive fine-tuning of the model, thus favoring plasticity, or on using a feature extractor fixed after the initial incremental state, thus favoring stability. We introduce a method that combines a fixed feature extractor and a pseudo-feature generator to improve the stability-plasticity balance. The generator uses a simple yet effective geometric translation of new class features to create pseudo-feature representations of past classes. This translation only requires the storage of the centroid representation of each past class. Actual features of new classes and pseudo-features of past classes are fed into a linear classifier, which is trained incrementally to discriminate between all classes. The incremental process is much faster with the proposed method than with mainstream methods that update the entire deep model. Experiments are performed on three challenging datasets and with different incremental settings. A comparison with ten existing methods shows that our method outperforms the others in most cases.
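The abstract above describes the core mechanism: translate features of new classes so that they are centered on stored past-class centroids, then train a linear classifier on real and pseudo-features. Below is a minimal sketch of that idea, assuming a frozen feature extractor and using NumPy with scikit-learn's LinearSVC; the function names, shapes, and data here are illustrative assumptions rather than the authors' released code.

```python
import numpy as np
from sklearn.svm import LinearSVC

def translate_features(new_feats, new_centroid, past_centroid):
    """Geometric translation of new-class features onto a past-class centroid.

    new_feats:     (n, d) features of one new class from the frozen extractor
    new_centroid:  (d,)   mean feature of that new class
    past_centroid: (d,)   stored mean feature of the past class
    Returns (n, d) pseudo-features standing in for the past class.
    """
    return new_feats - new_centroid + past_centroid

# --- illustrative usage with random stand-in features ---
rng = np.random.default_rng(0)
d = 512                                    # feature dimension of the frozen extractor
feats_new = rng.normal(size=(100, d))      # features of one new class
mu_new = feats_new.mean(axis=0)
mu_past = rng.normal(size=d)               # stored centroid of one past class

pseudo_past = translate_features(feats_new, mu_new, mu_past)

# Train a linear classifier on actual new-class features plus the
# pseudo-features of the past class (one class of each, for brevity).
X = np.vstack([feats_new, pseudo_past])
y = np.array([1] * len(feats_new) + [0] * len(pseudo_past))
clf = LinearSVC().fit(X, y)
```

Because only class centroids are stored and the deep model is never updated, each incremental step reduces to generating pseudo-features and refitting the linear classifier, which is what makes the process fast compared to full-model fine-tuning.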
Authors: Grégoire Petit, Adrian Popescu, Hugo Schindler, David Picard, Bertrand Delezoide