Poster in Workshop: Structured Probabilistic Inference and Generative Modeling
Bidirectional Consistency Models
Liangchen Li · Jiajun He
Keywords: [ Generative Models ] [ Diffusion Models ] [ Bidirectional Consistency Models ] [ Consistency Trajectory Models ] [ Consistency Training ] [ Consistency Models ] [ Inversion ]
Diffusion models (DMs) are capable of generating remarkably high-quality samples by iteratively denoising a random vector, a process that corresponds to moving along the probability flow ordinary differential equation (PF ODE). Interestingly, DMs can also invert an input image to noise by moving backward along the PF ODE, a key operation for downstream tasks such as interpolation and image editing. However, the iterative nature of this process restricts its speed. Recently, Consistency Models (CMs) have emerged to address this challenge by approximating the integral of the PF ODE, greatly reducing the number of iterations required for generation. Yet, the absence of an explicit ODE solver complicates the inversion process. To address this limitation, we introduce the Bidirectional Consistency Model (BCM), which learns a single neural network that enables both forward and backward traversal along the PF ODE, unifying generation and inversion within one framework. Our method supports one-step generation and inversion while allowing additional steps to enhance generation quality or reduce reconstruction error. Its bidirectional consistency also broadens its applications, allowing, for instance, interpolation between two real images, a task beyond the reach of previous CMs.
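For readers less familiar with the setup, the objects mentioned above can be summarized as follows. This is a sketch using notation assumed from the EDM/consistency-model literature, not taken from the abstract itself:

```latex
\begin{align*}
  \frac{\mathrm{d}\mathbf{x}_t}{\mathrm{d}t}
    &= -t\,\nabla_{\mathbf{x}_t}\log p_t(\mathbf{x}_t)
    && \text{(PF ODE, } t \in [\epsilon, T]\text{)} \\
  f_\theta(\mathbf{x}_t, t) &\approx \mathbf{x}_\epsilon
    && \text{(CM: map any trajectory point to its origin)} \\
  f_\theta(\mathbf{x}_t, t, u) &\approx \mathbf{x}_u
    && \text{(BCM: map to any time } u\text{, forward or backward)}
\end{align*}
```

Under this reading, setting $u = \epsilon$ recovers CM-style one-step generation, while $u = T$ performs one-step inversion of a real image to noise.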
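To make the workflow concrete, here is a minimal Python sketch of how such a bidirectional model could be called for one-step generation, one-step inversion, and interpolation between two real images. The function `bcm`, the noise-level range `EPS`/`T`, and all helper names are hypothetical placeholders for illustration, not the paper's released interface:

```python
import torch

# Hypothetical interface (an assumption, not the authors' API): a trained
# BCM is a single network f(x, t, u) that moves a point x at noise level t
# to noise level u along the same PF-ODE trajectory, in either direction.
def bcm(x: torch.Tensor, t: float, u: float) -> torch.Tensor:
    return x  # identity stand-in so the sketch runs; swap in a trained model

EPS, T = 0.002, 80.0  # assumed EDM-style noise-level range

def generate_one_step(shape: tuple) -> torch.Tensor:
    """One-step generation: jump from pure noise (t=T) to data (t=EPS)."""
    x_T = torch.randn(shape) * T
    return bcm(x_T, T, EPS)

def invert_one_step(image: torch.Tensor) -> torch.Tensor:
    """One-step inversion: jump from a real image (t=EPS) to noise (t=T)."""
    return bcm(image, EPS, T)

def interpolate(img_a: torch.Tensor, img_b: torch.Tensor,
                alpha: float = 0.5) -> torch.Tensor:
    """Interpolate two real images: invert both, spherically interpolate
    the recovered noises, then generate from the mixture."""
    z_a, z_b = invert_one_step(img_a), invert_one_step(img_b)
    cos = torch.clamp(torch.sum(z_a * z_b) / (z_a.norm() * z_b.norm()),
                      -1.0, 1.0)
    theta = torch.acos(cos)  # assumes z_a, z_b are not (anti)parallel
    z = (torch.sin((1 - alpha) * theta) * z_a
         + torch.sin(alpha * theta) * z_b) / torch.sin(theta)
    return bcm(z, T, EPS)
```

Spherical interpolation is used here (rather than a linear mix) on the common assumption that the inverted codes should stay near the Gaussian shell where the model was trained; the abstract itself does not specify the mixing scheme.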