How Relevant is Selective Memory Population in Lifelong Language Learning? (2210.00940v1)

Published 3 Oct 2022 in cs.CL, cs.AI, and cs.LG

Abstract: Lifelong language learning seeks to have models continuously learn multiple tasks in sequential order without suffering from catastrophic forgetting. State-of-the-art approaches rely on sparse experience replay as the primary mechanism to prevent forgetting. Experience replay usually adopts sampling methods to populate the memory; however, the effect of the chosen sampling strategy on model performance has not yet been studied. In this paper, we investigate how relevant selective memory population is to the lifelong learning process on text classification and question-answering tasks. We found that methods that randomly store a uniform number of samples from the entire data stream lead to high performance, especially at low memory sizes, a finding consistent with studies in computer vision.
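The winning strategy the abstract describes, keeping a uniform random subset of the entire data stream under a fixed memory budget, is commonly implemented with reservoir sampling. Below is a minimal Python sketch of such a replay buffer; the `ReservoirMemory` class and its method names are illustrative assumptions, not code from the paper.

```python
import random

class ReservoirMemory:
    """Fixed-size replay memory populated by reservoir sampling.

    Every example seen in the stream ends up in the buffer with equal
    probability, regardless of when it arrives, so the buffer is always
    a uniform random sample of the data observed so far.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0  # number of stream examples observed so far

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            # Fill the buffer until it reaches capacity.
            self.buffer.append(example)
        else:
            # Keep the new example with probability capacity / seen,
            # evicting a uniformly chosen stored example.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.buffer[idx] = example

    def sample(self, k):
        # Draw a mini-batch for sparse experience replay.
        return random.sample(self.buffer, min(k, len(self.buffer)))
```

In a sparse-replay training loop, one would call `add` on each incoming example and periodically interleave a `sample(k)` mini-batch into the gradient updates, so earlier tasks stay represented in proportion to their share of the stream.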

Authors (5)
  1. Vladimir Araujo (25 papers)
  2. Helena Balabin (2 papers)
  3. Julio Hurtado (17 papers)
  4. Alvaro Soto (34 papers)
  5. Marie-Francine Moens (102 papers)
Citations (7)