

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Practical and Asymptotically Exact Conditional Sampling in Diffusion Models

Brian Trippe · Luhuan Wu · Christian Naesseth · David Blei · John Cunningham

Keywords: [ Protein Design ] [ Sequential Monte Carlo ] [ Diffusion Model ] [ Theoretical Guarantees ]


Abstract:

Diffusion models have been successful on a range of conditional generation tasks, including molecular design and text-to-image generation. However, these achievements have primarily depended on expensive, task-specific conditional training or error-prone heuristic approximations to them. Ideally, a conditional generation method should provide exact samples for a broad range of conditional distributions without requiring task-specific training. To this end, we introduce the Twisted Diffusion Sampler, or TDS, a sequential Monte Carlo (SMC) algorithm that targets the conditional distributions of diffusion models. The main idea is to use twisting, an SMC technique that enjoys good computational efficiency, to incorporate heuristic approximations without compromising asymptotic exactness. We study the properties of TDS on MNIST image inpainting and class-conditional generation tasks. TDS extends to Riemannian diffusion models, which are crucial for protein modeling. When applied to the motif-scaffolding problem, a core problem in protein design, TDS enables more flexible conditioning criteria than conditionally trained models and provides state-of-the-art success rates on 9/12 problems in a benchmark set with scaffolds shorter than 100 residues.
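As a rough illustration of the twisted-SMC idea described in the abstract, the sketch below runs a particle filter over the reverse steps of a toy one-dimensional Gaussian diffusion, using an approximate likelihood of the observation under the denoised estimate as the twisting function. Everything here (the toy model, the `denoise` heuristic, `y_obs`, and all variable names) is a hypothetical stand-in rather than the authors' implementation; TDS additionally uses twisted proposals, whereas this sketch propagates particles with the unconditional reverse kernel and corrects only through the weights.

```python
# Minimal sketch: twisted SMC for conditional sampling from a toy diffusion model.
# Assumptions: x_0 ~ N(0, 1), observation y ~ N(x_0, obs_var), 1-D state space.
import numpy as np

rng = np.random.default_rng(0)

T = 50                                # number of diffusion steps
betas = np.linspace(1e-4, 0.2, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

y_obs, obs_var = 2.0, 0.1             # conditioning observation and its noise variance


def denoise(x_t, t):
    """Posterior-mean estimate of x_0 given x_t (stands in for a trained denoiser).
    For a standard-normal prior on x_0, Var(x_t) = 1, so E[x_0 | x_t] = sqrt(abar_t) x_t."""
    return np.sqrt(alpha_bars[t]) * x_t


def log_twist(x_t, t):
    """Twisting function: approximate log p(y | x_t) via the denoised estimate."""
    x0_hat = denoise(x_t, t)
    return -0.5 * (y_obs - x0_hat) ** 2 / obs_var


def reverse_step(x_t, t):
    """One unconditional ancestral-sampling step of the reverse diffusion (DDPM posterior)."""
    x0_hat = denoise(x_t, t)
    mean = (np.sqrt(alphas[t]) * (1 - alpha_bars[t - 1]) * x_t
            + np.sqrt(alpha_bars[t - 1]) * betas[t] * x0_hat) / (1 - alpha_bars[t])
    std = np.sqrt(betas[t] * (1 - alpha_bars[t - 1]) / (1 - alpha_bars[t]))
    return mean + std * rng.standard_normal(np.shape(x_t))


K = 256                               # number of particles
x = rng.standard_normal(K)            # particles at step T-1 (marginal is N(0, 1) in this toy model)
logw = log_twist(x, T - 1)            # initial weights from the twisting function

for t in range(T - 1, 0, -1):
    # Resample particles in proportion to their twisted weights.
    probs = np.exp(logw - logw.max())
    probs /= probs.sum()
    x = x[rng.choice(K, size=K, p=probs)]

    # Propagate with the unconditional reverse kernel, then reweight by the change
    # in the twisting function (the heuristic likelihood) along the move.
    x_prev = reverse_step(x, t)
    logw = log_twist(x_prev, t - 1) - log_twist(x, t)
    x = x_prev

# The weighted particles now approximate the conditional distribution p(x_0 | y).
probs = np.exp(logw - logw.max())
probs /= probs.sum()
print("posterior mean estimate:", float(np.sum(probs * x)))
```

With the reverse kernel as the proposal, the incremental weight reduces to the ratio of twisting functions at consecutive steps, which is why asymptotic exactness is preserved even though each individual twisting function is only a heuristic approximation.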
