
Saliency-Augmented Memory Completion for Continual Learning

Guangji Bai, Chen Ling, Yuyang Gao, Liang Zhao
Dec 2022
Continual learning is considered a key step toward next-generation Artificial Intelligence. Among various methods, replay-based approaches, which maintain and replay a small episodic memory of previous samples, are among the most successful strategies against catastrophic forgetting. However, since forgetting is inevitable given bounded memory and unbounded tasks, how to forget is a problem continual learning must address. Therefore, beyond simply avoiding catastrophic forgetting, an under-explored issue is how to reasonably forget while preserving the merits of human memory, including (1) storage efficiency, (2) generalizability, and (3) some interpretability. To achieve these simultaneously, our paper proposes a new saliency-augmented memory completion framework for continual learning, inspired by recent discoveries in memory completion separation in cognitive neuroscience. Specifically, we propose to store in episodic memory only the part of each image most important to the task, obtained via saliency-map extraction and memory encoding. When learning new tasks, previous data are inpainted from memory by an adaptive data-generation module, inspired by how humans complete episodic memories. The module's parameters are shared across all tasks, and it can be trained jointly with the continual-learning classifier as a bilevel optimization. Extensive experiments on several continual learning and image classification benchmarks demonstrate the proposed method's effectiveness and efficiency.
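To make the storage step concrete, here is a minimal sketch of saliency-masked episodic memory, not the paper's actual method: saliency is taken as the magnitude of the gradient of the true-class score with respect to each pixel (computed analytically for a hypothetical linear classifier), and only the top fraction of pixels is kept. The image size, classifier weights, and `keep_ratio` are all illustrative assumptions.

```python
import numpy as np

def saliency_mask(image, weights, label, keep_ratio=0.25):
    """Toy saliency map: |d(score_label)/d(pixel)| for a linear model
    score = weights @ x, where that gradient is just weights[label]."""
    grad = weights[label]                          # analytic input gradient
    saliency = np.abs(grad).reshape(image.shape)
    thresh = np.quantile(saliency, 1.0 - keep_ratio)
    return saliency >= thresh                      # mask of most salient pixels

rng = np.random.default_rng(0)
img = rng.random((8, 8))                           # hypothetical 8x8 grayscale image
W = rng.standard_normal((10, img.size))            # hypothetical 10-class linear classifier
mask = saliency_mask(img, W, label=3, keep_ratio=0.25)
episodic_entry = np.where(mask, img, 0.0)          # store only the salient 25% of pixels
```

On replay, the zeroed-out regions of `episodic_entry` would be filled in by a learned inpainting module; storing only the mask's worth of pixels is what yields the memory savings the abstract refers to.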