Poster

Overcoming Catastrophic Forgetting by Bayesian Generative Regularization

Pei-Hung Chen · Wei Wei · Cho-Jui Hsieh · Bo Dai

Keywords: [ Deep Learning ]

[ Paper ] [ Visit Poster at Spot C5 in Virtual World ]
Thu 22 Jul 9 p.m. PDT — 11 p.m. PDT
 
Spotlight presentation: Applications (NLP) 5
Thu 22 Jul 8:30 p.m. PDT — 9 p.m. PDT

Abstract:

In this paper, we propose a new method to overcome catastrophic forgetting by adding generative regularization to the Bayesian inference framework. Bayesian methods provide a general framework for continual learning. We further construct a generative regularization term for any given classification model by leveraging energy-based models and Langevin dynamics sampling to enrich the features learned in each task. By combining the discriminative and generative losses, we empirically show that the proposed method outperforms state-of-the-art methods on a variety of tasks, avoiding catastrophic forgetting in continual learning. In particular, the proposed method outperforms baseline methods by over 15% on the Fashion-MNIST dataset and by over 10% on the CUB dataset.
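The abstract's mention of Langevin dynamics sampling for energy-based models can be illustrated with a minimal sketch. This is not the paper's implementation; it shows the generic (unadjusted) Langevin update x ← x − (η/2)∇E(x) + √η·ξ for a user-supplied energy gradient, with the function name `langevin_sample` and all parameter choices being illustrative assumptions.

```python
import numpy as np

def langevin_sample(grad_energy, x0, step_size=0.01, n_steps=500, rng=None):
    """Draw an approximate sample from p(x) ∝ exp(-E(x)) via Langevin dynamics.

    grad_energy: callable returning ∇E(x); illustrative, not from the paper.
    """
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        # Langevin step: gradient descent on the energy plus injected noise.
        x = x - 0.5 * step_size * grad_energy(x) + np.sqrt(step_size) * noise
    return x

# Toy check: for E(x) = ||x||^2 / 2 (standard Gaussian), grad E(x) = x.
samples = np.array([langevin_sample(lambda x: x, np.zeros(2), rng=i)
                    for i in range(200)])
```

With enough steps, the empirical mean and standard deviation of `samples` approach 0 and 1, the moments of the Gaussian target. In the paper's setting, the sampled inputs would feed a generative loss term that is combined with the usual discriminative loss.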
