

Poster in Workshop: Subset Selection in Machine Learning: From Theory to Applications

Continual Learning via Function-Space Variational Inference: A Unifying View

Tim G. J. Rudner · Freddie Bickford Smith · Qixuan Feng · Yee-Whye Teh · Yarin Gal


Abstract:

Continual learning is the process of developing new abilities while retaining existing ones. Sequential Bayesian inference is a natural framework for this, but applying it successfully to deep neural networks remains a challenge. We propose continual function-space variational inference (C-FSVI), in which the variational distribution over functions induced by stochastic model parameters is encouraged to match the variational distribution over functions induced by stochastic parameters inferred on previous tasks. Unlike approaches that explicitly penalize changes in the model parameters, function-space regularization allows parameters to vary widely during training, resulting in greater flexibility to fit new data. C-FSVI improves on existing approaches to function-space regularization by performing inference entirely in function space and without relying on carefully selected coreset points. We show that C-FSVI outperforms alternative methods based on parameter-space and function-space regularization on a range of tasks.
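The contrast between parameter-space and function-space regularization can be illustrated with a toy over-parameterized model (a minimal sketch for intuition, not the paper's algorithm): two parameter settings that are far apart in parameter space can induce exactly the same function, so a function-space penalty evaluated at context points leaves them free to differ while a parameter-space penalty does not.

```python
import numpy as np

# Toy over-parameterized model: f(x) = w1 * w2 * x.
# Many distinct parameter vectors induce the same function.
def f(params, x):
    w1, w2 = params
    return w1 * w2 * x

old = np.array([1.0, 2.0])   # parameters inferred on a previous task
new = np.array([4.0, 0.5])   # rescaled: same induced function, distant parameters

# Context points at which the two functions are compared.
x_context = np.linspace(-1.0, 1.0, 50)

# Parameter-space penalty (e.g., an EWC-style quadratic with identity
# curvature): large, because the parameters moved a lot.
param_penalty = np.sum((new - old) ** 2)

# Function-space penalty: zero, because the induced functions coincide
# at every context point.
func_penalty = np.mean((f(new, x_context) - f(old, x_context)) ** 2)

print(param_penalty)  # 11.25
print(func_penalty)   # 0.0
```

In C-FSVI the discrepancy is a KL divergence between distributions over functions rather than a squared error between point predictions, but the same effect applies: regularizing in function space permits large parameter changes as long as the predictive behavior on previous tasks is preserved.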