Talk

Variational Boosting: Iteratively Refining Posterior Approximations

Andrew Miller · Nicholas J Foti · Ryan P. Adams

C4.9 & C4.10

Abstract:

We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing a trade-off between computation time and accuracy. We expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that the resulting posterior inferences compare favorably to existing variational algorithms.
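The core idea of the component-addition step can be illustrated with a short sketch: fit the new mixture component and its mixing weight by stochastic gradient ascent on a Monte Carlo ELBO estimate, holding the previously fitted components fixed. The code below is an illustrative 1D toy in JAX, not the authors' implementation; the bimodal target, step sizes, initialization, and sample counts are all assumptions made for the example.

```python
# Minimal sketch of variational boosting's "add one component" step (illustrative only).
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp
from jax.scipy.stats import norm

def log_target(x):
    # Unnormalized bimodal target, a stand-in for an intractable posterior.
    return logsumexp(jnp.stack([norm.logpdf(x, -2.0, 0.6),
                                norm.logpdf(x, 2.0, 0.8)]), axis=0)

def mixture_logpdf(x, means, log_stds, log_weights):
    comp = norm.logpdf(x[:, None], means[None, :], jnp.exp(log_stds)[None, :])
    return logsumexp(comp + log_weights[None, :], axis=1)

def add_component(key, means, log_stds, log_weights,
                  n_samples=256, steps=2000, lr=1e-2):
    """Optimize one new Gaussian component (mean, log-std) and its mixing
    weight, keeping the existing components fixed -- the boosting step."""
    # New mean, new log-std, and a logit-parameterized mixing weight rho.
    params = jnp.array([means.mean() + 1.0, 0.0, 0.0])

    def elbo(params, key):
        m_new, ls_new, logit_rho = params
        rho = jax.nn.sigmoid(logit_rho)
        new_lw = jnp.concatenate([log_weights + jnp.log1p(-rho), jnp.log(rho)[None]])
        all_m = jnp.concatenate([means, m_new[None]])
        all_ls = jnp.concatenate([log_stds, ls_new[None]])

        key_new, key_old, key_idx = jax.random.split(key, 3)
        # Reparameterized samples from the new component (gradients flow here) ...
        eps = jax.random.normal(key_new, (n_samples,))
        x_new = m_new + jnp.exp(ls_new) * eps
        # ... and samples from the existing mixture, which do not depend on the
        # new parameters.
        idx = jax.random.categorical(key_idx, log_weights, shape=(n_samples,))
        x_old = means[idx] + jnp.exp(log_stds[idx]) * jax.random.normal(key_old, (n_samples,))

        def term(x):
            return jnp.mean(log_target(x) - mixture_logpdf(x, all_m, all_ls, new_lw))
        # ELBO of q = (1 - rho) * q_old + rho * q_new, decomposed by component.
        return (1.0 - rho) * term(x_old) + rho * term(x_new)

    grad_fn = jax.jit(jax.grad(elbo))
    for _ in range(steps):
        key, sub = jax.random.split(key)
        params = params + lr * grad_fn(params, sub)  # gradient ascent on the ELBO

    m_new, ls_new, logit_rho = params
    rho = jax.nn.sigmoid(logit_rho)
    return (jnp.concatenate([means, m_new[None]]),
            jnp.concatenate([log_stds, ls_new[None]]),
            jnp.concatenate([log_weights + jnp.log1p(-rho), jnp.log(rho)[None]]))

# Start from a single Gaussian, then boost in new components one at a time,
# trading extra computation for a richer approximation.
key = jax.random.PRNGKey(0)
means, log_stds, log_weights = jnp.array([-2.0]), jnp.array([0.0]), jnp.array([0.0])
for _ in range(2):
    key, sub = jax.random.split(key)
    means, log_stds, log_weights = add_component(sub, means, log_stds, log_weights)
print(means, jnp.exp(log_stds), jnp.exp(log_weights))
```

Each call to the (hypothetical) `add_component` solves one optimization problem in the sequence; the earlier components are frozen, so the runtime/accuracy trade-off is controlled simply by how many components are added.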
