

Poster

Variational Auto-Regressive Gaussian Processes for Continual Learning

Sanyam Kapoor · Theofanis Karaletsos · Thang Bui

Virtual

Keywords: [ Gaussian Processes and Bayesian non-parametrics ]


Abstract:

By sequentially constructing posteriors as data are observed online, Bayes' theorem provides a natural framework for continual learning. We develop Variational Auto-Regressive Gaussian Processes (VAR-GPs), a principled posterior-updating mechanism for solving sequential tasks in continual learning. Relying on sparse inducing-point approximations for scalable posteriors, we propose a novel auto-regressive variational distribution that reveals two fruitful connections to existing results in Bayesian inference: expectation propagation and orthogonal inducing points. Mean predictive entropy estimates show that VAR-GPs prevent catastrophic forgetting, and strong performance against competitive baselines on modern continual learning benchmarks supports this empirically. A thorough ablation study demonstrates the efficacy of our modeling choices.
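To make the auto-regressive construction concrete, below is a minimal NumPy sketch of the two ingredients named in the abstract: a standard sparse-GP predictive given a variational distribution q(u) at inducing inputs, and a joint variational distribution over old and new inducing outputs built auto-regressively as q(u1, u2) = q(u1) q(u2 | u1). This is an illustrative assumption, not the authors' implementation: the RBF kernel, the function name svgp_predict, and the linear-Gaussian conditional parameters A, b, Sigma are all hypothetical choices made for the sketch.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def svgp_predict(Xs, Z, m, S, jitter=1e-6):
    """Sparse-GP predictive q(f*) given q(u) = N(m, S) at inducing inputs Z:
       mean = Ksz Kzz^{-1} m,
       cov  = Kss - Ksz Kzz^{-1} (Kzz - S) Kzz^{-1} Kzs."""
    Kzz = rbf_kernel(Z, Z) + jitter * np.eye(len(Z))
    Ksz = rbf_kernel(Xs, Z)
    Kss = rbf_kernel(Xs, Xs)
    A = Ksz @ np.linalg.inv(Kzz)            # Ksz Kzz^{-1}
    return A @ m, Kss - A @ (Kzz - S) @ A.T

# --- Task 1: variational posterior q(u1) = N(m1, S1) at inducing inputs Z1 ---
# (m1, S1 would be learned by maximizing a variational bound; random here.)
rng = np.random.default_rng(0)
Z1 = rng.normal(size=(5, 1))
m1 = rng.normal(size=5)
L1 = np.tril(rng.normal(size=(5, 5))) + 5 * np.eye(5)
S1 = 0.01 * L1 @ L1.T                       # positive semi-definite covariance

# --- Task 2: auto-regressive conditional q(u2 | u1) = N(A u1 + b, Sigma) ---
# Standard Gaussian identities give the joint over [u1, u2]:
#   mean = [m1; A m1 + b],  cov = [[S1, S1 A^T], [A S1, A S1 A^T + Sigma]]
Z2 = rng.normal(size=(3, 1))                # new inducing inputs for task 2
A2 = 0.1 * rng.normal(size=(3, 5))
b2 = rng.normal(size=3)
Sigma2 = 0.05 * np.eye(3)

m_joint = np.concatenate([m1, A2 @ m1 + b2])
S_joint = np.block([[S1,       S1 @ A2.T],
                    [A2 @ S1,  A2 @ S1 @ A2.T + Sigma2]])
Z_joint = np.vstack([Z1, Z2])

Xs = np.linspace(-2, 2, 7)[:, None]
mean, cov = svgp_predict(Xs, Z_joint, m_joint, S_joint)
print(mean.shape, cov.shape)                # (7,), (7, 7)
```

The design point the sketch illustrates: because the new variational factor conditions on the old inducing outputs rather than replacing them, the joint distribution retains the task-1 marginal q(u1) exactly, which is the mechanism by which an auto-regressive construction can guard against catastrophic forgetting.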
