

Contributed talk in Workshop: Uncertainty and Robustness in Deep Learning (2019)

Quality of Uncertainty Quantification for Bayesian Neural Network Inference

Jiayu Yao


Abstract:

Bayesian Neural Networks (BNNs) place priors over the parameters of a neural network. Inference in BNNs, however, is difficult; all inference methods for BNNs are approximate. In this work, we empirically compare the quality of predictive uncertainty estimates for 10 common inference methods on both regression and classification tasks. Our experiments demonstrate that commonly used metrics (e.g., test log-likelihood) can be misleading. Our experiments also indicate that inference innovations designed to capture structure in the posterior do not necessarily produce high-quality posterior approximations.
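For context, the test log-likelihood mentioned in the abstract is typically estimated from posterior samples with a Monte Carlo average of the predictive density. The sketch below illustrates this for a regression task with fixed Gaussian observation noise; the function name and the fixed-noise assumption are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def mc_test_log_likelihood(y_test, mu_samples, sigma):
    """Monte Carlo estimate of the average predictive log-likelihood.

    y_test:      (N,) array of test targets
    mu_samples:  (S, N) array of predictive means, one row per posterior sample
    sigma:       observation noise standard deviation (assumed known and fixed here)
    """
    S = mu_samples.shape[0]
    # log p(y_n | x_n) ~= log [ (1/S) * sum_s N(y_n | mu_s(x_n), sigma^2) ]
    log_probs = norm.logpdf(y_test[None, :], loc=mu_samples, scale=sigma)  # (S, N)
    per_point = logsumexp(log_probs, axis=0) - np.log(S)                   # (N,)
    return per_point.mean()
```

A metric like this summarizes the predictive distribution with a single number, which is one reason it can be misleading about the quality of the underlying posterior approximation.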
