Subspace Inference for Bayesian Deep Learning
Polina Kirichenko · Pavel Izmailov · Andrew Wilson

Fri Jun 14 02:00 PM -- 02:10 PM (PDT)

Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well-calibrated uncertainty. However, scaling Bayesian inference techniques to deep neural networks is challenging due to the high dimensionality of the parameter space. In this paper, we construct low-dimensional subspaces of parameter space that contain diverse sets of models, such as the first principal components of the stochastic gradient descent (SGD) trajectory. In these subspaces, we are able to apply elliptical slice sampling and variational inference, which struggle in the full parameter space. We show that Bayesian model averaging over the induced posterior in these subspaces produces highly accurate predictions and well-calibrated predictive uncertainty for both regression and image classification.
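The subspace construction described in the abstract can be sketched in a few lines of NumPy: collect flattened weight vectors along the SGD trajectory, take the top principal components of the centered iterates as the subspace basis, and map low-dimensional coordinates back to full weight space. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names (`build_subspace`, `to_weights`) and the toy trajectory are hypothetical.

```python
import numpy as np

def build_subspace(trajectory, rank=2):
    """Build a low-dimensional subspace from SGD iterates (illustrative sketch).

    trajectory: array of shape (T, D) -- flattened weight vectors collected
    along the SGD trajectory.
    Returns the mean weight vector and a (rank, D) matrix whose rows are the
    top principal directions of the trajectory.
    """
    theta_hat = trajectory.mean(axis=0)       # shift: mean of the iterates
    deviations = trajectory - theta_hat       # center the trajectory
    # Top principal components via SVD of the centered deviation matrix.
    _, _, vt = np.linalg.svd(deviations, full_matrices=False)
    return theta_hat, vt[:rank]               # basis P has shape (rank, D)

def to_weights(z, theta_hat, P):
    """Map low-dimensional coordinates z back to the full weight space."""
    return theta_hat + z @ P

# Toy usage: a fake 5-step trajectory in a 4-dimensional weight space.
traj = np.array([[0.0, 0.0, 0.0, 0.0],
                 [1.0, 0.1, 0.0, 0.0],
                 [2.0, 0.2, 0.1, 0.0],
                 [3.0, 0.1, 0.0, 0.1],
                 [4.0, 0.0, 0.1, 0.0]])
theta_hat, P = build_subspace(traj, rank=2)
w = to_weights(np.zeros(2), theta_hat, P)   # z = 0 recovers the mean weights
```

Inference (e.g., elliptical slice sampling) would then run over the low-dimensional coordinates `z`, with each sample mapped through `to_weights` before evaluating the network's likelihood and averaging predictions.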

Author Information

Polina Kirichenko (Cornell University)
Pavel Izmailov (Cornell University)
Andrew Wilson (Cornell University)

Andrew Gordon Wilson is a faculty member in the Courant Institute and Center for Data Science at NYU. His interests include probabilistic modelling, Gaussian processes, Bayesian statistics, physics-inspired machine learning, and loss surfaces and generalization in deep learning. His webpage is https://cims.nyu.edu/~andrewgw.
