We develop a method to combine Markov chain Monte Carlo (MCMC) and variational inference (VI), leveraging the advantages of both inference approaches. Specifically, we improve the variational distribution by running a few MCMC steps. To make inference tractable, we introduce the variational contrastive divergence (VCD), a new divergence that replaces the standard Kullback-Leibler (KL) divergence used in VI. The VCD captures a notion of discrepancy between the initial variational distribution and its improved version (obtained after running the MCMC steps), and it converges asymptotically to the symmetrized KL divergence between the variational distribution and the posterior of interest. The VCD objective can be optimized efficiently with respect to the variational parameters via stochastic optimization. We show experimentally that optimizing the VCD leads to better predictive performance on two latent variable models: logistic matrix factorization and variational autoencoders (VAEs).
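As a rough sketch of the construction the abstract describes (the notation below is assumed for illustration and is not quoted from the paper), write $q_\theta^{(0)}(z)$ for the initial variational distribution and $q_\theta^{(t)}(z)$ for its improved version after $t$ MCMC steps targeting the posterior $p(z \mid x)$. A divergence of the kind described would take the form

\[
\mathcal{L}_{\mathrm{VCD}}(\theta) \;=\; \mathrm{KL}\big(q_\theta^{(0)} \,\|\, p(z \mid x)\big) \;-\; \mathrm{KL}\big(q_\theta^{(t)} \,\|\, p(z \mid x)\big) \;+\; \mathrm{KL}\big(q_\theta^{(t)} \,\|\, q_\theta^{(0)}\big) \;\ge\; 0,
\]

so that as $t \to \infty$ the improved distribution approaches the posterior, the middle term vanishes, and the objective tends to the symmetrized KL divergence $\mathrm{KL}(q_\theta^{(0)} \,\|\, p) + \mathrm{KL}(p \,\|\, q_\theta^{(0)})$, consistent with the asymptotic behavior stated above.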
Author Information
Francisco Ruiz (University of Cambridge / Columbia University)
Michalis Titsias (DeepMind)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: A Contrastive Divergence for Combining Variational Inference and MCMC (Wed. Jun 12th, 01:30 -- 04:00 AM, Pacific Ballroom #210)
More from the Same Authors
- 2018 Poster: Augment and Reduce: Stochastic Inference for Large Categorical Distributions (Francisco Ruiz · Michalis Titsias · Adji Bousso Dieng · David Blei)
- 2018 Oral: Augment and Reduce: Stochastic Inference for Large Categorical Distributions (Francisco Ruiz · Michalis Titsias · Adji Bousso Dieng · David Blei)