Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima (2111.01549v2)

Published 30 Oct 2021 in cs.LG and cs.CV

Abstract: This paper considers incremental few-shot learning, which requires a model to continually recognize new categories with only a few examples provided. Our study shows that existing methods severely suffer from catastrophic forgetting, a well-known problem in incremental learning, which is aggravated by data scarcity and imbalance in the few-shot setting. Our analysis further suggests that to prevent catastrophic forgetting, actions need to be taken in the primitive stage -- the training of base classes -- rather than in later few-shot learning sessions. Therefore, we propose to search for flat local minima of the base training objective function and then fine-tune the model parameters within the flat region on new tasks. In this way, the model can efficiently learn new classes while preserving the old ones. Comprehensive experimental results demonstrate that our approach outperforms all prior state-of-the-art methods and is very close to the approximate upper bound. The source code is available at https://github.com/moukamisama/F2M.
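The approach has two stages: during base training, search for a flat local minimum by optimizing the loss under small random perturbations of the parameters; during each few-shot session, fine-tune while keeping the parameters inside that flat region. Below is a minimal PyTorch sketch of this idea, based only on the abstract. The perturbation scheme, the element-wise box constraint, and hyperparameters such as noise_std and radius are illustrative assumptions, not the paper's exact F2M implementation (see the linked repository for that).

```python
import torch

def flat_minima_step(model, loss_fn, batch, optimizer,
                     noise_std=0.01, num_samples=4):
    # Assumed sketch: approximate the expected loss in a small
    # neighborhood of the current parameters by averaging over
    # random perturbations. This averaged loss is low only where
    # the minimum is flat, so descending it favors flat minima.
    inputs, targets = batch
    optimizer.zero_grad()
    for _ in range(num_samples):
        noises = []
        for p in model.parameters():
            noise = torch.randn_like(p) * noise_std
            p.data.add_(noise)           # perturb in place
            noises.append(noise)
        loss = loss_fn(model(inputs), targets) / num_samples
        loss.backward()                  # gradients accumulate across samples
        for p, noise in zip(model.parameters(), noises):
            p.data.sub_(noise)           # restore original parameters
    optimizer.step()

def clamp_to_flat_region(model, base_params, radius=0.01):
    # Assumed sketch: after each few-shot update, project the
    # parameters back into a box of half-width `radius` around the
    # base-session solution, so learning new classes stays inside
    # the flat region and old classes are preserved.
    with torch.no_grad():
        for p, p0 in zip(model.parameters(), base_params):
            p.copy_(torch.max(torch.min(p, p0 + radius), p0 - radius))
```

In use, one would snapshot base_params = [p.detach().clone() for p in model.parameters()] after base training, take ordinary gradient steps on each few-shot session's data, and call clamp_to_flat_region after every step; the exact noise distribution, region shape, and classification details are in the repository above.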

Authors (5)
  1. Guangyuan Shi (8 papers)
  2. Jiaxin Chen (55 papers)
  3. Wenlong Zhang (93 papers)
  4. Li-Ming Zhan (10 papers)
  5. Xiao-Ming Wu (91 papers)
Citations (130)
