Workshop: INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models

Invited talk 4: Divergence Measures in Variational Inference and How to Choose Them

Cheng Zhang


Abstract:

Variational inference (VI) plays an essential role in approximate Bayesian inference due to its computational efficiency and broad applicability. Crucial to the performance of VI is the choice of the associated divergence measure, since VI approximates the intractable posterior by minimizing this divergence. In this talk, I will first discuss variational inference with different divergence measures. Then, I will present a new meta-learning algorithm that learns a divergence measure suited to the task of interest, automating the design of VI methods.
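To make the idea of "VI as divergence minimization" concrete, here is a minimal toy sketch (not from the talk): the target "posterior" is a 1-D Gaussian treated as if intractable, the variational family is also Gaussian, and the reverse KL divergence (the standard choice in VI) is minimized by gradient descent. All parameter names and values are illustrative assumptions.

```python
import numpy as np

# Toy illustration: VI approximates a target distribution by minimizing a
# divergence. Target "posterior": N(mu_p, sigma_p^2) (intractable in real
# problems). Variational family: N(mu_q, sigma_q^2). We minimize the
# reverse KL divergence KL(q || p), which is available in closed form for
# two Gaussians, by plain gradient descent on (mu_q, log sigma_q).

mu_p, sigma_p = 2.0, 0.5        # target parameters (assumed known here)
mu_q, log_sigma_q = 0.0, 0.0    # variational parameters (initialized away from target)

def kl_gauss(mu_q, sigma_q, mu_p, sigma_p):
    # KL(N(mu_q, sigma_q^2) || N(mu_p, sigma_p^2)), closed form
    return (np.log(sigma_p / sigma_q)
            + (sigma_q**2 + (mu_q - mu_p)**2) / (2.0 * sigma_p**2) - 0.5)

lr = 0.1
for _ in range(500):
    sigma_q = np.exp(log_sigma_q)
    # Analytic gradients of the closed-form KL w.r.t. mu_q and log sigma_q
    d_mu = (mu_q - mu_p) / sigma_p**2
    d_log_sigma = sigma_q**2 / sigma_p**2 - 1.0
    mu_q -= lr * d_mu
    log_sigma_q -= lr * d_log_sigma

print(mu_q, np.exp(log_sigma_q))  # converges toward the target (2.0, 0.5)
```

Swapping in a different divergence (e.g. forward KL or an alpha-divergence) changes the gradients and hence where the variational parameters end up, which is exactly why the choice of divergence, and the idea of learning it, matters.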
