

Poster

Reparameterized Importance Sampling for Robust Variational Bayesian Neural Networks

Yunfei Long · Zilin Tian · Liguo Zhang · Huosheng Xu


Abstract:

Mean-field variational inference (MFVI) methods provide computationally cheap approximations to the posterior of Bayesian Neural Networks (BNNs) when compared to alternatives like MCMC. However, applying MFVI to BNNs encounters limitations due to the Monte Carlo sampling problem. This problem stems from two main issues. First, most samples do not accurately represent the most probable weights. Second, random sampling from variational distributions introduces high variance in gradient estimates, which can hinder the optimization process, leading to slow convergence or even failure. In this paper, we introduce a novel sampling method called Reparameterized Importance Sampling (RIS) to estimate the first moment in neural networks, reducing variance during feed-forward propagation. We begin by analyzing the generalized form of the optimal proposal distribution and presenting an inexpensive approximation. Next, we describe the sampling process from the proposal distribution as a transformation that combines exogenous randomness with the variational parameters. Our experimental results demonstrate the effectiveness of the proposed RIS method in three critical aspects: improved convergence, enhanced predictive performance, and successful uncertainty estimation for out-of-distribution data.
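To make the core idea concrete, the following is a minimal sketch (not the paper's method) of the two ingredients the abstract combines: a reparameterized draw from a proposal distribution, and self-normalized importance weights that correct the estimate back toward the variational distribution q. The proposal parameters here are arbitrary illustrative choices; the paper instead derives an approximation to the optimal proposal, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational posterior q(w) = N(mu, sigma^2) over a single scalar weight.
mu, sigma = 0.5, 1.0
# Hypothetical proposal r(w) = N(mu_r, sigma_r^2); chosen arbitrarily for
# illustration, standing in for the paper's approximately optimal proposal.
mu_r, sigma_r = 0.8, 1.2

def normal_pdf(x, m, s):
    """Density of N(m, s^2) evaluated at x."""
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Reparameterized draw from the proposal: w = mu_r + sigma_r * eps.
# Writing the sample as a deterministic transformation of exogenous noise
# eps is what lets gradients flow through (mu_r, sigma_r) under autodiff.
eps = rng.standard_normal(10_000)
w = mu_r + sigma_r * eps

# Importance weights correct the mismatch between q and the proposal r.
iw = normal_pdf(w, mu, sigma) / normal_pdf(w, mu_r, sigma_r)

# Self-normalized importance-sampling estimate of the first moment E_q[w];
# should land close to mu.
estimate = np.sum(iw * w) / np.sum(iw)
```

A well-chosen proposal concentrates samples where q places most of its mass, which is exactly the variance-reduction effect the abstract targets during feed-forward propagation.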
