Poster
Variational Auto-Regressive Gaussian Processes for Continual Learning
Sanyam Kapoor · Theofanis Karaletsos · Thang Bui

Thu Jul 22 09:00 AM -- 11:00 AM (PDT) @ Virtual

By sequentially constructing posteriors as data are observed online, Bayes' theorem provides a natural framework for continual learning. We develop Variational Auto-Regressive Gaussian Processes (VAR-GPs), a principled posterior-updating mechanism for solving sequential tasks in continual learning. Relying on sparse inducing-point approximations for scalable posteriors, we propose a novel auto-regressive variational distribution that reveals two fruitful connections to existing results in Bayesian inference: expectation propagation and orthogonal inducing points. Mean predictive entropy estimates show that VAR-GPs prevent catastrophic forgetting, which is empirically supported by strong performance on modern continual learning benchmarks against competitive baselines. A thorough ablation study demonstrates the efficacy of our modeling choices.
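For intuition, the sketch below illustrates the two ingredients the abstract combines: (i) the predictive distribution of a sparse GP with inducing inputs Z and a Gaussian posterior q(u) = N(m, S) over inducing values, and (ii) a sequential Bayesian update in which each task's posterior over u becomes the prior for the next. This is a simplified illustration under stated assumptions, not the authors' VAR-GP: it keeps a fixed inducing set and uses exact linear-Gaussian conditioning in place of the paper's auto-regressive variational distribution, and all function names (rbf_kernel, sparse_predict, sequential_update) are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row sets A (n,d) and B (m,d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_predict(X_star, Z, m, S, jitter=1e-6):
    """Predictive q(f*) = \int p(f* | u) q(u) du for q(u) = N(m, S) at inducing inputs Z."""
    Kzz = rbf_kernel(Z, Z) + jitter * np.eye(len(Z))
    Ksz = rbf_kernel(X_star, Z)
    Kss = rbf_kernel(X_star, X_star)
    A = Ksz @ np.linalg.inv(Kzz)          # K_{*z} K_{zz}^{-1}
    mu = A @ m
    cov = Kss - A @ (Kzz - S) @ A.T       # prior cov corrected by q(u)
    return mu, cov

def sequential_update(Z, m0, S0, X, y, sigma2, jitter=1e-6):
    """Condition the current Gaussian belief N(m0, S0) over u on a new task's data.

    Approximates y ~ N(K_xz K_zz^{-1} u, sigma2 I); the Nystrom residual term
    is dropped for brevity (a deliberate simplification of sparse GP inference).
    """
    Kzz = rbf_kernel(Z, Z) + jitter * np.eye(len(Z))
    A = rbf_kernel(X, Z) @ np.linalg.inv(Kzz)
    S0_inv = np.linalg.inv(S0)
    S = np.linalg.inv(S0_inv + A.T @ A / sigma2)   # standard Gaussian conditioning
    m = S @ (S0_inv @ m0 + A.T @ y / sigma2)
    return m, S

# Toy usage: two sequential 1-D regression tasks, updated without revisiting old data.
rng = np.random.default_rng(0)
Z = np.linspace(-3.0, 3.0, 10)[:, None]
m = np.zeros(10)
S = rbf_kernel(Z, Z) + 1e-6 * np.eye(10)           # q(u) initialised at the GP prior
for task in range(2):
    X = rng.uniform(-3.0, 3.0, size=(20, 1))
    y = np.sin(2.0 * X[:, 0] + task) + 0.1 * rng.standard_normal(20)
    m, S = sequential_update(Z, m, S, X, y, sigma2=0.01)
mu, cov = sparse_predict(np.linspace(-3.0, 3.0, 50)[:, None], Z, m, S)
print(mu[:5], np.diag(cov)[:5])
```

Because the old posterior enters only through (m, S), the update never revisits past data, which is the online-Bayes property the abstract appeals to; VAR-GPs go further by introducing new inducing points per task and coupling them auto-regressively.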

Author Information

Sanyam Kapoor (New York University)
Theofanis Karaletsos (Facebook)
Thang Bui (University of Sydney)
