TLCE: Transfer-Learning Based Classifier Ensembles for Few-Shot Class-Incremental Learning

Published 7 Dec 2023 in cs.CV (arXiv:2312.04225v1)

Abstract: Few-shot class-incremental learning (FSCIL) struggles to incrementally recognize novel classes from few examples without catastrophic forgetting of old classes or overfitting to new classes. We propose TLCE, which ensembles multiple pre-trained models to improve separation of novel and old classes. TLCE minimizes interference between old and new classes by mapping old class images to quasi-orthogonal prototypes using episodic training. It then ensembles diverse pre-trained models to better adapt to novel classes despite data imbalance. Extensive experiments on various datasets demonstrate that our transfer learning ensemble approach outperforms state-of-the-art FSCIL methods.
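The core classification step described in the abstract — comparing query embeddings to class prototypes and averaging scores over several pre-trained feature extractors — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes per-model feature matrices are already extracted, represents each class by its mean embedding, and omits the episodic quasi-orthogonal prototype mapping entirely. All function names here are hypothetical.

```python
import numpy as np

def class_prototypes(features, labels):
    # One prototype per class: the mean embedding of that class's support examples.
    classes = np.unique(labels)
    return {c: features[labels == c].mean(axis=0) for c in classes}

def ensemble_predict(query_feats_per_model, protos_per_model):
    """Average cosine similarity to each class prototype across an ensemble of
    pre-trained feature extractors, then return the best-scoring class per query.

    query_feats_per_model: list of (n_queries, dim_k) arrays, one per model.
    protos_per_model:      list of {class: prototype} dicts, one per model.
    """
    classes = sorted(protos_per_model[0].keys())
    n_queries = query_feats_per_model[0].shape[0]
    scores = np.zeros((n_queries, len(classes)))
    for q, protos in zip(query_feats_per_model, protos_per_model):
        # L2-normalize queries and prototypes so the dot product is cosine similarity.
        qn = q / np.linalg.norm(q, axis=1, keepdims=True)
        P = np.stack([protos[c] for c in classes])
        Pn = P / np.linalg.norm(P, axis=1, keepdims=True)
        scores += qn @ Pn.T  # accumulate this model's similarity scores
    return np.array(classes)[scores.argmax(axis=1)]
```

Because each model votes through its own similarity scores, a class that one backbone separates poorly can still be recovered by the others — the diversity of the pre-trained models is what the ensemble exploits.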

