Poster in Workshop: Structured Probabilistic Inference and Generative Modeling
EVCL: Elastic Variational Continual Learning with Weight Consolidation
Hunar Batra · Ronald Clark
Keywords: [ Bayesian Deep Learning ] [ Catastrophic Forgetting ] [ Continual Learning ] [ Bayesian Neural Networks ]
Continual learning aims to enable models to learn new tasks sequentially without forgetting previously acquired knowledge. This work introduces Elastic Variational Continual Learning with Weight Consolidation (EVCL), a novel hybrid model that integrates the variational posterior approximation mechanism of Variational Continual Learning (VCL) with the regularization-based parameter-protection strategy of Elastic Weight Consolidation (EWC). By combining the strengths of both methods, EVCL effectively mitigates catastrophic forgetting and better captures the dependencies between model parameters and task-specific data. Evaluated on five discriminative tasks, EVCL consistently outperforms existing baselines in both domain-incremental and task-incremental learning scenarios for deep discriminative models.
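As a minimal sketch of how the two objectives might combine (the notation below, including the trade-off weight \(\lambda\) and the placement of the penalty, is our assumption and not taken verbatim from the paper): for task \(t\) with data \(\mathcal{D}_t\), the VCL evidence lower bound is augmented with an EWC-style quadratic penalty anchored at the previous task's parameter estimate \(\theta^{*}_{t-1}\) and weighted by the diagonal Fisher information \(F_{t-1}\),

$$
\mathcal{L}_{\text{EVCL}}(q_t) \;=\; \underbrace{\mathbb{E}_{q_t(\theta)}\!\big[\log p(\mathcal{D}_t \mid \theta)\big] \;-\; \mathrm{KL}\!\big(q_t(\theta)\,\|\,q_{t-1}(\theta)\big)}_{\text{VCL ELBO}} \;-\; \frac{\lambda}{2} \sum_i F_{t-1,i}\, \mathbb{E}_{q_t}\!\big[(\theta_i - \theta^{*}_{t-1,i})^2\big].
$$

Maximizing this objective retains VCL's recursive update of the variational posterior from \(q_{t-1}\) to \(q_t\), while the quadratic term additionally protects parameters that the Fisher information marks as important for earlier tasks; applying the penalty directly to the variational parameters rather than in expectation over \(q_t\) is an equally plausible variant.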