Oral
Fast and Simple Natural-Gradient Variational Inference with Mixture of Exponential-family Approximations
Wu Lin · Mohammad Emtiyaz Khan · Mark Schmidt

Tue Jun 11 12:05 PM -- 12:10 PM (PDT) @ Room 101

Natural-gradient methods enable fast and simple algorithms for variational inference, but due to computational difficulties, their use is mostly limited to minimal exponential-family (EF) approximations. In this paper, we extend the application of natural-gradient methods to estimate structured approximations such as mixtures of EF distributions. Such approximations can fit complex, multimodal posterior distributions and are generally more accurate than unimodal EF approximations. By using a minimal conditional-EF representation of such approximations, we derive simple natural-gradient updates. Our empirical results demonstrate faster convergence of our natural-gradient method compared to black-box gradient-based methods. Our work expands the scope of natural gradients for Bayesian inference and makes them more widely applicable than before.
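To give a feel for why natural-gradient updates are "fast and simple" in the minimal-EF setting that the paper generalizes, here is a minimal sketch of natural-gradient variational inference for a single 1-D Gaussian approximation. It is not the paper's mixture algorithm; the target density, step size, and sample budget are illustrative assumptions. It uses the standard identity that the natural gradient of the ELBO with respect to the natural parameters equals the ordinary gradient with respect to the mean parameters, which for a Gaussian yields closed-form updates for the mean and variance.

```python
# Illustrative sketch only: natural-gradient VI with a single 1-D Gaussian
# q(z) = N(mu, v). This is NOT the paper's mixture-of-EF algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy target: log-density gradients of N(2.0, 0.5^2).
target_mu, target_sigma = 2.0, 0.5
def dlogp(z):  return -(z - target_mu) / target_sigma**2        # d/dz log p(z)
def d2logp(z): return -np.ones_like(z) / target_sigma**2        # d^2/dz^2 log p(z)

mu, v = 0.0, 4.0      # initial variational mean and variance
beta, n_mc = 0.2, 64  # step size and Monte Carlo samples per step

for _ in range(200):
    z = mu + np.sqrt(v) * rng.standard_normal(n_mc)  # samples from q
    g1 = dlogp(z).mean()                             # E_q[ d log p(z)/dz ]
    g2 = d2logp(z).mean()                            # E_q[ d^2 log p(z)/dz^2 ]
    # One natural-gradient step, written in (mu, v) form:
    #   1/v <- (1 - beta)/v - beta * E_q[d^2 log p]
    #   mu  <- mu + beta * v_new * E_q[d log p]
    v = 1.0 / ((1.0 - beta) / v - beta * g2)
    mu = mu + beta * v * g1

print(f"q = N({mu:.3f}, {np.sqrt(v):.3f}^2)  vs  target N({target_mu}, {target_sigma}^2)")
```

With a Gaussian toy target the variance update has the exact posterior variance as its fixed point, so the iterates converge quickly; the paper's contribution is to obtain comparably simple updates when q is a mixture of EF distributions, via a minimal conditional-EF representation.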

Author Information

Wu Lin (University of British Columbia)
Mohammad Emtiyaz Khan (RIKEN)
Mark Schmidt (University of British Columbia)
