

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Improving Consistency Models with Generator-Induced Coupling

Thibaut Issenhuth · Ludovic Dos Santos · Jean-Yves Franceschi · Alain Rakotomamonjy

Keywords: [ consistency models ]


Abstract:

Consistency models are promising generative models, as they distill the multi-step sampling of score-based diffusion models into a single forward pass of a neural network. Without access to the sampling trajectories of a pre-trained diffusion model, consistency training relies on proxy trajectories built from an independent coupling between the noise and data distributions. Refining this coupling is a key avenue of improvement, making it better adapted to the task and reducing the resulting randomness in the training process. In this work, we introduce a novel coupling that associates noisy input data with the consistency model's own generated output, as a proxy for the inaccessible diffusion flow output. Our affordable approach exploits the inherent capacity of consistency models to compute the transport map in a single step. We provide intuition and empirical evidence for the relevance of our generator-induced coupling (GC), which brings consistency training closer to score distillation. Consequently, our method not only accelerates consistency training convergence by significant amounts but also enhances the resulting performance.
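The coupling idea described above can be sketched in a few lines. The following is a minimal illustration under simplifying assumptions, not the paper's implementation: a toy linear stand-in plays the role of the consistency model, noising follows an additive `x_t = x + t·z` scheme, and all function names (`consistency_model`, `gc_training_pair`) are hypothetical. The key point it shows is that the pair of adjacent noisy points used in the consistency loss is built around the model's own one-step output rather than around an independently drawn data sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def consistency_model(x_t, t, theta):
    # Toy linear stand-in for the consistency model f_theta:
    # maps a noisy input at noise level t to a one-step sample.
    return x_t - theta * t

def gc_training_pair(x, t, t_next, theta, rng):
    """Build a pair of adjacent noisy points with the generator-induced
    coupling (GC): the point at the smaller noise level t_next is noised
    around the model's OWN one-step output (a proxy for the diffusion
    flow output), instead of around the original data sample."""
    z = rng.standard_normal(x.shape)
    x_t = x + t * z                            # noisy input at level t
    x_hat = consistency_model(x_t, t, theta)   # one-step generation
    x_t_next = x_hat + t_next * z              # GC point at level t_next
    return x_t, x_t_next

# Example: one training pair for a 4-dimensional toy sample.
x = rng.standard_normal(4)
pair = gc_training_pair(x, t=1.0, t_next=0.5, theta=0.1, rng=rng)
```

In an actual training loop, a consistency loss would then be applied between the model's outputs at the two points of the pair; the sketch only covers the construction of the coupling itself.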
