Task-Adaptive Saliency Guidance for Exemplar-free Class Incremental Learning (2212.08251v2)
Abstract: Exemplar-free Class Incremental Learning (EFCIL) aims to learn tasks sequentially with access only to data from the current task. EFCIL is of interest because it mitigates concerns about privacy and long-term data storage, while at the same time alleviating the problem of catastrophic forgetting in incremental learning. In this work, we introduce task-adaptive saliency for EFCIL and propose a new framework, which we call Task-Adaptive Saliency Supervision (TASS), for mitigating the negative effects of saliency drift between different tasks. We first apply boundary-guided saliency to maintain task adaptivity and *plasticity* of model attention. In addition, we introduce task-agnostic low-level signals as auxiliary supervision to increase the *stability* of model attention. Finally, we introduce a module for injecting and recovering saliency noise to increase the robustness of saliency preservation. Our experiments demonstrate that our method better preserves saliency maps across tasks and achieves state-of-the-art results on the CIFAR-100, Tiny-ImageNet, and ImageNet-Subset EFCIL benchmarks. Code is available at https://github.com/scok30/tass.
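To make the abstract's saliency-noise component concrete, the sketch below illustrates the general idea of injecting noise into a model's attention map and learning to recover it while distilling saliency from a frozen previous-task model. This is a minimal, assumption-laden illustration and not the paper's implementation: the channel-pooled attention map, the Gaussian noise level `sigma`, the two-layer recovery head, and the MSE distillation loss are all hypothetical stand-ins for TASS's actual components.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def attention_map(features: torch.Tensor) -> torch.Tensor:
    """Channel-pooled spatial attention, normalized per sample to [0, 1].

    `features` is a conv feature map of shape (B, C, H, W); squaring and
    averaging over channels yields a (B, 1, H, W) saliency-like map.
    """
    attn = features.pow(2).mean(dim=1, keepdim=True)  # (B, 1, H, W)
    b, _, h, w = attn.shape
    flat = attn.view(b, -1)
    lo = flat.min(dim=1, keepdim=True).values
    hi = flat.max(dim=1, keepdim=True).values
    flat = (flat - lo) / (hi - lo + 1e-8)
    return flat.view(b, 1, h, w)


class SaliencyNoiseModule(nn.Module):
    """Inject Gaussian noise into a saliency map and learn to recover it.

    The noise scale and the small convolutional recovery head are
    illustrative choices, not the paper's exact architecture.
    """

    def __init__(self, sigma: float = 0.1):
        super().__init__()
        self.sigma = sigma
        self.recover = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, sal: torch.Tensor):
        noisy = (sal + self.sigma * torch.randn_like(sal)).clamp(0, 1)
        return noisy, self.recover(noisy)


def saliency_preservation_loss(feat_new, feat_old, noise_mod):
    """Distill old-task attention into the new model via noisy saliency.

    `feat_old` comes from the frozen previous-task model, so gradients
    flow only through `feat_new` and the recovery head.
    """
    sal_new = attention_map(feat_new)
    sal_old = attention_map(feat_old).detach()
    _, recovered = noise_mod(sal_new)
    return F.mse_loss(recovered, sal_old)


if __name__ == "__main__":
    feat_new = torch.randn(4, 64, 8, 8, requires_grad=True)
    feat_old = torch.randn(4, 64, 8, 8)
    mod = SaliencyNoiseModule()
    loss = saliency_preservation_loss(feat_new, feat_old, mod)
    loss.backward()
    print(f"saliency preservation loss: {loss.item():.4f}")
```

In an actual EFCIL pipeline, `feat_old` would be produced by a frozen copy of the previous-task backbone, so minimizing this loss penalizes the current model for drifting away from the attention patterns learned on earlier tasks even when the saliency signal is perturbed.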