
Sub-network Discovery and Soft-masking for Continual Learning of Mixed Tasks (2310.09436v1)

Published 13 Oct 2023 in cs.CL, cs.AI, cs.LG, and cs.NE

Abstract: Continual learning (CL) has two main objectives: preventing catastrophic forgetting (CF) and encouraging knowledge transfer (KT). The existing literature has mainly focused on overcoming CF. Some work has also been done on KT when the tasks are similar. To our knowledge, only one method has been proposed to learn a sequence of mixed tasks. However, these techniques still suffer from CF and/or limited KT. This paper proposes a new CL method to achieve both. It overcomes CF by isolating the knowledge of each task via discovering a sub-network for it. A soft-masking mechanism is also proposed to preserve the previous knowledge and to enable the new task to leverage the past knowledge to achieve KT. Experiments using classification, generation, information extraction, and their mixture (i.e., heterogeneous tasks) show that the proposed method consistently outperforms strong baselines.
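The soft-masking mechanism described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the `importance` scores (assumed to lie in [0, 1] and to come from accumulated statistics of previous tasks) and the single weight matrix are placeholders chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

weights = rng.standard_normal((4, 8))       # shared parameters across tasks
importance = rng.uniform(0.0, 1.0, (4, 8))  # hypothetical importance to old tasks
grad = rng.standard_normal((4, 8))          # gradient from the new task's loss

# Soft-mask: scale the new task's gradient by (1 - importance), so updates
# mostly flow through parameters that matter little to previous tasks.
# The forward pass still uses all parameters, letting the new task read
# (and thus transfer from) previously acquired knowledge.
masked_grad = grad * (1.0 - importance)
weights -= 0.1 * masked_grad  # one SGD step with the masked gradient
```

Because the mask only attenuates gradients rather than zeroing them out (as a hard binary mask would), highly important parameters are protected from overwriting while less-used capacity remains fully trainable for the new task.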

Authors (5)
  1. Zixuan Ke (26 papers)
  2. Bing Liu (212 papers)
  3. Wenhan Xiong (47 papers)
  4. Asli Celikyilmaz (81 papers)
  5. Haoran Li (168 papers)
Citations (2)
