

Talk in Affinity Workshop: Women in Machine Learning (WiML) Un-Workshop

Invited Talk #1 - Evaluating approximate inference for BNNs

Yingzhen Li


Abstract:

Bayesian neural networks (BNNs) are one of the major approaches for obtaining uncertainty estimates for deep learning models. Key to their success is the choice of approximate inference algorithm used to compute the approximate posterior, with mean-field variational inference (MFVI) and MC-dropout being the most popular variants. But is the good downstream uncertainty estimation performance of BNNs attributable to good approximate inference? In this talk I will discuss some of our recent results towards answering this question. I will also briefly discuss the computational reasons behind the preference for MFVI and MC-dropout, and describe our latest work on making BNNs more memory efficient.
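As background for the abstract above, MC-dropout obtains an approximate posterior predictive by keeping dropout active at test time and averaging several stochastic forward passes. The following is a minimal NumPy sketch of that idea on a hypothetical two-layer network; the network, its weights, and all parameter values are illustrative assumptions, not material from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: 1 input -> 16 hidden units -> 1 output.
# These random weights are an illustration only, not from the talk.
W1 = rng.normal(size=(1, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, p=0.5):
    """One stochastic forward pass; dropout stays ON at test time (MC-dropout)."""
    h = np.maximum(x @ W1, 0.0)         # ReLU hidden layer
    mask = rng.random(h.shape) > p      # Bernoulli dropout mask
    h = h * mask / (1.0 - p)            # inverted-dropout rescaling
    return h @ W2

def mc_dropout_predict(x, T=200):
    """Average T stochastic passes: mean is the prediction,
    std serves as a (crude) uncertainty estimate."""
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x)
```

The predictive standard deviation here plays the role of the uncertainty estimate discussed in the abstract; whether such estimates reflect genuinely good approximate inference is exactly the question the talk addresses.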