

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Stabilizing the Training of Consistency Models with Score Guidance

Jeongjun Lee · Jonggeon Park · Jongmin Yoon · Juho Lee

Keywords: [ Diffusion Models ] [ Consistency Models ]


Abstract:

Consistency models achieve superior sample quality with only a few sampling steps, even without relying on pre-trained diffusion models as teachers. However, as the total number of discretization steps increases, their training becomes unstable due to large variance, which leads to suboptimal performance. It is known that this instability can be mitigated by initializing the weights from a pre-trained diffusion model, suggesting that diffusion models can help address the problem. Inspired by this, we introduce a transformation layer, termed the score head, which is trained jointly with the consistency model so that the two together form a larger diffusion model. Updating the consistency model with the additional gradients coming from the score head reduces variance during training. Moreover, we empirically demonstrate that, when trained with the score head, the consistency model learns features shared with the score model. Accordingly, the sample quality of the consistency model improves when measured on CIFAR-10.
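The joint update described in the abstract, where the consistency model's shared weights receive gradients both from their own objective and from the attached score head, can be sketched with a toy linear model. Everything here is an illustrative assumption: the names (`W`, `c_head`, `s_head`), the squared-error surrogate losses, and the weighting `lam` stand in for the paper's actual parameterization and objectives, which the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 4
W = rng.normal(size=(d, d)) * 0.1       # shared trunk (stand-in for the consistency model)
c_head = rng.normal(size=(d, d)) * 0.1  # consistency output head
s_head = rng.normal(size=(d, d)) * 0.1  # hypothetical score head
lam, lr = 0.5, 0.05                     # score-loss weight, learning rate (assumed)

def joint_loss(x, t_cm, t_s):
    """Sum of squared-error surrogates for the consistency and score losses."""
    h = W @ x
    l_cm = 0.5 * np.sum((c_head @ h - t_cm) ** 2)
    l_sc = 0.5 * np.sum((s_head @ h - t_s) ** 2)
    return l_cm + lam * l_sc

x = rng.normal(size=d)     # toy input
t_cm = rng.normal(size=d)  # stand-in consistency target
t_s = rng.normal(size=d)   # stand-in score target

loss_before = joint_loss(x, t_cm, t_s)
for _ in range(500):
    h = W @ x
    g_f = c_head @ h - t_cm  # residual of the consistency head
    g_s = s_head @ h - t_s   # residual of the score head
    # The shared weights W receive gradients from BOTH heads; the second
    # term is the extra signal contributed by the score head.
    grad_W = np.outer(c_head.T @ g_f, x) + lam * np.outer(s_head.T @ g_s, x)
    grad_c = np.outer(g_f, h)
    grad_s = lam * np.outer(g_s, h)
    W -= lr * grad_W
    c_head -= lr * grad_c
    s_head -= lr * grad_s
loss_after = joint_loss(x, t_cm, t_s)
print(f"joint loss: {loss_before:.3f} -> {loss_after:.3f}")
```

The key line is `grad_W`: without the score head, only the first term would update the shared weights; the second term is the additional gradient path the abstract credits with reducing training variance.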
