Poster
Noisy Natural Gradient as Variational Inference
Guodong Zhang · Shengyang Sun · David Duvenaud · Roger Grosse

Thu Jul 12th 06:15 -- 09:00 PM @ Hall B #198

Variational Bayesian neural nets combine the flexibility of deep learning with Bayesian uncertainty estimation. Unfortunately, there is a tradeoff between cheap but simple variational families (e.g. fully factorized) and expensive, complicated inference procedures. We show that natural gradient ascent with adaptive weight noise implicitly fits a variational posterior to maximize the evidence lower bound (ELBO). This insight allows us to train full-covariance, fully factorized, or matrix-variate Gaussian variational posteriors using noisy versions of natural gradient, Adam, and K-FAC, respectively, making it possible to scale up to modern-size ConvNets. On standard regression benchmarks, our noisy K-FAC algorithm makes better predictions and matches Hamiltonian Monte Carlo's predictive variances better than existing methods. Its improved uncertainty estimates lead to more efficient exploration in active learning and provide an intrinsic motivation signal for reinforcement learning.
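The simplest instance of this idea is noisy Adam: a fully factorized Gaussian posterior over the weights whose per-weight variance comes from an Adam-style second-moment (Fisher-like) estimate. Below is a minimal NumPy sketch of that update; the hyperparameter names (kl_weight, prior_var, damping) and the exact scaling are illustrative assumptions, not a transcription of the paper's algorithm.

```python
import numpy as np

# Minimal sketch of the "noisy Adam" idea: a fully factorized Gaussian posterior
# N(mu, sigma^2) over the weights, where sigma^2 is derived from an Adam-style
# running estimate f of squared gradients plus the prior precision.
# Hyperparameter names and the exact scaling below are illustrative assumptions.

def sample_weights(mu, f, kl_weight, prior_var, n_data, damping=1e-8):
    # Posterior variance shrinks as the curvature (Fisher-like) estimate f grows.
    sigma2 = kl_weight / (n_data * (f + kl_weight / (n_data * prior_var)) + damping)
    return mu + np.sqrt(sigma2) * np.random.randn(*mu.shape)

def noisy_adam_step(mu, m, f, grad_loglik, t, lr=1e-3, beta1=0.9, beta2=0.999,
                    kl_weight=1.0, prior_var=1.0, n_data=50_000, damping=1e-8):
    """One update of the posterior mean mu, given the gradient of the data
    log-likelihood evaluated at weights sampled from the current posterior."""
    # ELBO gradient w.r.t. the mean: data term plus the Gaussian prior term.
    g = grad_loglik - kl_weight * mu / (n_data * prior_var)
    m = beta1 * m + (1 - beta1) * g                   # first-moment estimate
    f = beta2 * f + (1 - beta2) * grad_loglik ** 2    # second-moment (Fisher-like)
    m_hat = m / (1 - beta1 ** t)                      # bias correction
    mu = mu + lr * m_hat / (f + damping)              # preconditioned mean update
    return mu, m, f

# Example: one iteration for a toy 3-parameter model (gradient supplied by the user).
mu, m, f = np.zeros(3), np.zeros(3), np.zeros(3)
w = sample_weights(mu, f, kl_weight=1.0, prior_var=1.0, n_data=50_000)
grad_loglik = -w  # stand-in gradient; replace with backprop through the model
mu, m, f = noisy_adam_step(mu, m, f, grad_loglik, t=1)
```

Noisy K-FAC follows the same pattern but tracks a Kronecker-factored curvature estimate per layer, which corresponds to a matrix-variate Gaussian posterior.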

Author Information

Guodong Zhang (University of Toronto)
Shengyang Sun (University of Toronto)
David Duvenaud (University of Toronto)
Roger Grosse (University of Toronto and Vector Institute)
