Association: Remind Your GAN not to Forget (2011.13553v2)

Published 27 Nov 2020 in cs.CV

Abstract: Neural networks are susceptible to catastrophic forgetting: they fail to preserve previously acquired knowledge when adapting to new tasks. Inspired by the human associative memory system, we propose a brain-like approach that imitates the associative learning process to achieve continual learning. We design a heuristic mechanism that potentiatively stimulates the model, guiding it to recall historical episodes based on the current circumstance and accumulated association experience. In addition, a distillation measure depressively alters the efficacy of synaptic transmission, dampening feature-reconstruction learning for the new task. The framework is mediated by potentiation and depression stimulation, which play opposing roles in directing synaptic and behavioral plasticity. It requires no access to the original data and is closer to the human cognitive process. Experiments demonstrate the effectiveness of our method in alleviating catastrophic forgetting on image-to-image translation tasks.
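The opposing potentiation/depression terms described above can be caricatured as a two-part objective: a new-task loss plus a distillation penalty that pulls the adapting model's features toward those of a frozen copy trained on earlier tasks. The sketch below is a generic illustration of that idea, not the paper's actual formulation; the function names and the trade-off weight `lam` are assumptions.

```python
import numpy as np

def distillation_loss(feat_old, feat_new):
    """Mean squared distance between features of the frozen old model
    and the adapting model -- the "depression" term that dampens drift
    away from previously learned representations."""
    return float(np.mean((np.asarray(feat_old) - np.asarray(feat_new)) ** 2))

def continual_loss(task_loss, feat_old, feat_new, lam=0.5):
    """Total objective: new-task loss plus a weighted distillation
    penalty. `lam` is a hypothetical trade-off weight, not a value
    taken from the paper."""
    return task_loss + lam * distillation_loss(feat_old, feat_new)

# Identical features incur no penalty; divergent features are punished.
aligned = continual_loss(1.0, np.zeros(3), np.zeros(3))
drifted = continual_loss(1.0, np.ones(3), np.zeros(3))
```

In this caricature, the "potentiation" side of the framework would correspond to replaying recalled episodes through the new-task loss, while the distillation term plays the depressive, plasticity-limiting role.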

Authors (8)
  1. Yi Gu (69 papers)
  2. Jie Li (553 papers)
  3. Yuting Gao (25 papers)
  4. Ruoxin Chen (9 papers)
  5. Chentao Wu (15 papers)
  6. Feiyang Cai (10 papers)
  7. Chao Wang (555 papers)
  8. Zirui Zhang (17 papers)
