DS-AL: A Dual-Stream Analytic Learning for Exemplar-Free Class-Incremental Learning (2403.17503v1)
Abstract: Class-incremental learning (CIL) under an exemplar-free constraint presents a significant challenge. Existing methods adhering to this constraint are prone to catastrophic forgetting, far more so than replay-based techniques that retain access to past samples. In this paper, to solve the exemplar-free CIL problem, we propose a Dual-Stream Analytic Learning (DS-AL) approach. The DS-AL contains a main stream offering an analytical (i.e., closed-form) linear solution, and a compensation stream mitigating the inherent under-fitting limitation of adopting a linear mapping. The main stream recasts the CIL problem as a Concatenated Recursive Least Squares (C-RLS) task, establishing an equivalence between CIL and its joint-learning counterpart. The compensation stream is governed by a Dual-Activation Compensation (DAC) module, which re-activates the embedding with an activation function different from that of the main stream, and seeks fitting compensation by projecting the embedding onto the null space of the main stream's linear mapping. Empirical results demonstrate that the DS-AL, despite being an exemplar-free technique, delivers performance comparable with or better than that of replay-based methods across various datasets, including CIFAR-100, ImageNet-100, and ImageNet-Full. Additionally, the equivalence property of the C-RLS allows the DS-AL to execute CIL in a phase-invariant manner. This is evidenced by a never-before-seen 500-phase CIL ImageNet task, which performs on a level identical to that of a 5-phase one. Our code is available at https://github.com/ZHUANGHP/Analytic-continual-learning.
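The two mechanisms the abstract names — a recursive least-squares update whose result matches the joint (all-data) ridge solution, which is what makes the method phase-invariant, and a projection onto the null space of the main stream's linear mapping, which is how the compensation stream avoids disturbing the main stream — can be illustrated in a few lines of NumPy. This is a minimal sketch of the generic techniques, not the paper's implementation; the function names and the regularizer `gamma` are our assumptions.

```python
import numpy as np

def rls_init(d, c, gamma=1.0):
    # R starts as the inverse of the regularizer gamma*I; weights W start at zero.
    return np.eye(d) / gamma, np.zeros((d, c))

def rls_update(R, W, X, Y):
    # Woodbury-style recursive least-squares step for one data phase (X, Y).
    # After the update, W equals the ridge solution fitted on ALL phases so far,
    # without storing any past samples.
    K = R @ X.T @ np.linalg.inv(np.eye(len(X)) + X @ R @ X.T)
    R = R - K @ X @ R
    W = W + R @ X.T @ (Y - X @ W)
    return R, W

def null_space_projector(W):
    # Projector P onto the null space of the mapping x -> x @ W,
    # so (x @ P) @ W == 0: features projected this way cannot change
    # the main stream's output, only add a compensation term.
    return np.eye(W.shape[0]) - W @ np.linalg.pinv(W)

# Demo: four incremental "phases" of random data.
rng = np.random.default_rng(0)
d, c = 8, 3
R, W = rls_init(d, c, gamma=1.0)
Xs, Ys = [], []
for _ in range(4):
    X, Y = rng.standard_normal((20, d)), rng.standard_normal((20, c))
    Xs.append(X)
    Ys.append(Y)
    R, W = rls_update(R, W, X, Y)

# Joint ridge regression on the concatenation of all phases:
# the recursion above reproduces it exactly.
Xa, Ya = np.vstack(Xs), np.vstack(Ys)
W_joint = np.linalg.solve(Xa.T @ Xa + np.eye(d), Xa.T @ Ya)

P = null_space_projector(W)
```

The equivalence `W == W_joint` is why splitting training into 5 or 500 phases yields the same classifier, and the projector `P` captures the null-space idea behind the DAC compensation stream.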