

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Denoising Diffusion Variational Inference: Diffusion Models as Expressive Variational Posteriors

Wasu Top Piriyakulkij · Yingheng Wang · Volodymyr Kuleshov

Keywords: [ Latent Variable Models ] [ Diffusion Models ] [ Generative Modeling ] [ Approximate Inference ] [ Visualization ]


Abstract:

We propose denoising diffusion variational inference (DDVI), a black-box variational inference algorithm for latent variable models that relies on diffusion models as flexible approximate posteriors. Specifically, our method introduces an expressive class of diffusion-based variational posteriors that perform iterative refinement in latent space; we train these posteriors with a novel regularized evidence lower bound (ELBO) on the marginal likelihood inspired by the wake-sleep algorithm. Our method is easy to implement (it fits a regularized extension of the ELBO), is compatible with black-box variational inference, and outperforms alternative classes of approximate posteriors based on normalizing flows or adversarial networks. We find that DDVI improves inference and learning in deep latent variable models across common benchmarks as well as on a motivating task in biology---inferring latent ancestry from human genomes---where it outperforms strong baselines on the Thousand Genomes dataset.
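To make the two ingredients of the abstract concrete, below is a minimal, self-contained sketch of how a conditional diffusion model can act as an approximate posterior q(z | x) that iteratively refines a latent, trained with a reconstruction term plus a wake-sleep-style denoising regularizer. This is an illustrative assumption-laden sketch, not the authors' implementation: the architectures, dimensions, noise schedule, squared-error reconstruction loss, and the exact form of the regularizer are all choices made here for brevity, and the paper's regularized ELBO may differ.

```python
# Illustrative sketch (not the paper's code): a latent-variable model whose
# approximate posterior q(z | x) is a small conditional diffusion model.
# Training alternates a "wake" step (reconstruct real data through the sampled
# posterior) and a "sleep" step (denoising regression on latents dreamed from
# the model), loosely mirroring a wake-sleep-style regularized ELBO.
import torch
import torch.nn as nn

Z, X, T = 16, 784, 10          # latent dim, data dim, diffusion steps (assumed)

betas = torch.linspace(1e-4, 0.2, T)          # DDPM-style noise schedule
alphas = 1.0 - betas
alpha_bar = torch.cumprod(alphas, dim=0)

# Generative model p(x | z): a small decoder network (assumed architecture).
decoder = nn.Sequential(nn.Linear(Z, 256), nn.ReLU(), nn.Linear(256, X))

class Denoiser(nn.Module):
    """Conditional denoiser eps(z_t, t, x): predicts the noise added to z_0."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(Z + X + 1, 256), nn.ReLU(), nn.Linear(256, Z))
    def forward(self, z_t, t, x):
        t_emb = t.float().unsqueeze(-1) / T   # scalar timestep embedding
        return self.net(torch.cat([z_t, x, t_emb], dim=-1))

denoiser = Denoiser()
opt = torch.optim.Adam(
    list(decoder.parameters()) + list(denoiser.parameters()), lr=1e-3)

def posterior_sample(x):
    """Iterative refinement in latent space: ancestral sampling of q(z | x)."""
    z = torch.randn(x.shape[0], Z)
    for t in reversed(range(T)):
        tb = torch.full((x.shape[0],), t)
        eps = denoiser(z, tb, x)
        mean = (z - betas[t] / torch.sqrt(1 - alpha_bar[t]) * eps) \
               / torch.sqrt(alphas[t])
        z = mean + (betas[t].sqrt() * torch.randn_like(z) if t > 0 else 0.0)
    return z

def train_step(x_real):
    # Wake: reconstruct real data through the diffusion posterior; gradients
    # flow into both decoder and denoiser via the reparameterized sampler.
    z = posterior_sample(x_real)
    recon = ((decoder(z) - x_real) ** 2).mean()   # stands in for -log p(x|z)

    # Sleep-style regularizer: dream (z0, x) from the model, then train the
    # denoiser to recover z0 from a noised copy, conditioned on the dreamed x.
    with torch.no_grad():
        z0 = torch.randn(x_real.shape[0], Z)      # prior p(z) = N(0, I)
        x_dream = decoder(z0)
    t = torch.randint(0, T, (x_real.shape[0],))
    noise = torch.randn_like(z0)
    a = alpha_bar[t].unsqueeze(-1)
    z_t = a.sqrt() * z0 + (1 - a).sqrt() * noise
    sleep = ((denoiser(z_t, t, x_dream) - noise) ** 2).mean()

    loss = recon + sleep
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

One design point the sketch tries to capture: because the posterior is a sampler rather than a closed-form density, its KL term against the prior is intractable, which is why a wake-sleep-style regularizer (here, denoising regression on model-generated latents) stands in for the usual analytic KL of a Gaussian VAE.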
