Architecture Matters in Continual Learning (2202.00275v1)

Published 1 Feb 2022 in cs.LG and cs.AI

Abstract: A large body of research in continual learning is devoted to overcoming catastrophic forgetting in neural networks by designing new algorithms that are robust to distribution shifts. However, the majority of these works focus strictly on the "algorithmic" part of continual learning for a "fixed neural network architecture", and the implications of using different architectures are mostly neglected. Even the few existing continual learning methods that modify the model assume a fixed architecture and aim to develop an algorithm that uses the model efficiently throughout the learning experience. In this work, we show that the choice of architecture can significantly impact continual learning performance, and that different architectures lead to different trade-offs between the ability to remember previous tasks and the ability to learn new ones. Moreover, we study the impact of various architectural decisions, and our findings yield best practices and recommendations that can improve continual learning performance.
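The trade-off the abstract describes is typically quantified with two standard continual-learning metrics, average accuracy and forgetting, computed from per-task accuracies recorded after each training stage. The sketch below is a minimal illustration of those standard metrics, not the paper's code, and the accuracy matrix in the example is hypothetical.

```python
import numpy as np

def continual_learning_metrics(acc):
    """Compute standard continual-learning metrics from an accuracy matrix.

    acc[i, j] = test accuracy on task j after training on task i,
    shape (T, T) for T tasks (rows: training stage, cols: evaluated task).
    """
    T = acc.shape[0]
    # Average accuracy: mean accuracy over all tasks after training on the last one.
    avg_acc = acc[-1].mean()
    # Forgetting: for each earlier task, the drop from its best accuracy at any
    # past stage to its final accuracy, averaged over tasks 0..T-2.
    forgetting = np.mean([acc[:-1, j].max() - acc[-1, j] for j in range(T - 1)])
    return avg_acc, forgetting

# Hypothetical 3-task run: the model learns each new task well but forgets old ones.
acc = np.array([
    [0.95, 0.00, 0.00],
    [0.70, 0.93, 0.00],
    [0.55, 0.68, 0.94],
])
avg, fgt = continual_learning_metrics(acc)
print(f"average accuracy = {avg:.3f}, forgetting = {fgt:.3f}")
```

Under these metrics, an architecture that scores higher average accuracy at equal forgetting (or lower forgetting at equal accuracy) sits on a better point of the remembering-versus-learning trade-off curve.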

Authors (7)
  1. Seyed Iman Mirzadeh (6 papers)
  2. Arslan Chaudhry (15 papers)
  3. Dong Yin (36 papers)
  4. Timothy Nguyen (16 papers)
  5. Razvan Pascanu (138 papers)
  6. Dilan Gorur (10 papers)
  7. Mehrdad Farajtabar (56 papers)
Citations (53)