Distilled Replay: Overcoming Forgetting through Synthetic Samples (2103.15851v2)
Abstract: Replay strategies are Continual Learning techniques that mitigate catastrophic forgetting by keeping a buffer of patterns from previous experiences and interleaving them with new data during training. The number of patterns stored in the buffer is a critical parameter which largely influences both the final performance and the memory footprint of the approach. This work introduces Distilled Replay, a novel replay strategy for Continual Learning that mitigates forgetting while keeping a very small buffer (one pattern per class) of highly informative samples. Distilled Replay builds the buffer through a distillation process which compresses a large dataset into a tiny set of informative examples. We show the effectiveness of Distilled Replay against popular replay-based strategies on four Continual Learning benchmarks.
- Andrea Rosasco
- Antonio Carta
- Andrea Cossu
- Vincenzo Lomonaco
- Davide Bacciu
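
The abstract combines two mechanisms: compressing past data into one learnable synthetic pattern per class (dataset distillation via bilevel optimization) and then replaying that tiny buffer inside every mini-batch of the new experience. The following is a minimal PyTorch sketch of both steps, not the authors' implementation: the linear learner, fixed zero initialization, single inner SGD step, learning rates, and random placeholder data are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
num_classes, dim = 10, 784           # e.g. flattened MNIST-sized inputs (assumption)
real_x = torch.randn(512, dim)       # placeholder for data from a past experience
real_y = torch.randint(0, num_classes, (512,))

# Buffer: exactly one learnable synthetic pattern per class, as in the paper.
syn_x = torch.randn(num_classes, dim, requires_grad=True)
syn_y = torch.arange(num_classes)
outer_opt = torch.optim.Adam([syn_x], lr=1e-2)
inner_lr = 0.1

for step in range(200):
    # Inner step: train a fresh linear model for one SGD step on the buffer.
    # (A fixed zero init is a simplification of sampling initializations.)
    w = torch.zeros(num_classes, dim, requires_grad=True)
    b = torch.zeros(num_classes, requires_grad=True)
    inner_loss = F.cross_entropy(syn_x @ w.t() + b, syn_y)
    gw, gb = torch.autograd.grad(inner_loss, (w, b), create_graph=True)
    w1, b1 = w - inner_lr * gw, b - inner_lr * gb

    # Outer step: the one-step-trained model should fit the real data;
    # gradients flow back into the synthetic patterns through the inner update.
    outer_loss = F.cross_entropy(real_x @ w1.t() + b1, real_y)
    outer_opt.zero_grad()
    outer_loss.backward()
    outer_opt.step()

# Replay: interleave the distilled buffer with mini-batches of the new experience.
model = torch.nn.Linear(dim, num_classes)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
new_x = torch.randn(256, dim)        # placeholder for the current experience
new_y = torch.randint(0, num_classes, (256,))
buf_x, buf_y = syn_x.detach(), syn_y

for x, y in zip(new_x.split(32), new_y.split(32)):
    batch_x = torch.cat([x, buf_x])  # the tiny buffer joins every mini-batch
    batch_y = torch.cat([y, buf_y])
    loss = F.cross_entropy(model(batch_x), batch_y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the buffer holds only one pattern per class, appending it to each mini-batch adds negligible memory and compute, which is the trade-off the abstract highlights against replay strategies that store many raw samples.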