

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

CM-GAN: Stabilizing GAN Training with Consistency Models

Haoye Lu · Yiwei Lu · Dihong Jiang · Spencer Szabados · Sun Sun · Yaoliang Yu

Keywords: [ Probability Flow ODE ] [ GAN ] [ Diffusion ] [ Consistency Model ]


Abstract:

In recent years, generative adversarial networks (GANs) have gained attention for their ability to generate realistic images, despite being notoriously difficult to train. Diffusion models, on the other hand, have emerged as a promising alternative, offering stable training and avoiding mode collapse; however, their generation process is computationally expensive. To overcome this problem, Song et al. (2023) proposed consistency models (CMs), which are optimized through a novel consistency constraint induced by the underlying diffusion process. In this paper, we show that the same consistency constraint can be used to stabilize the training of GANs and alleviate the notorious mode collapse problem. In this way, we provide a method that combines the main strengths of diffusion models and GANs while mitigating their major drawbacks. Additionally, since the technique can also be viewed as fine-tuning a consistency model with a discriminator, it is expected to outperform CMs in general. We provide preliminary empirical results on MNIST to corroborate our claims.
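
As a minimal sketch of the idea, and not the authors' implementation, the snippet below combines a standard non-saturating GAN loss with the consistency training loss of Song et al. (2023): the generator is parameterized as a consistency model f_theta(x_t, t), its one-step samples from pure noise are scored by a discriminator, and an added consistency term ties its outputs at adjacent noise levels on the same diffusion trajectory to an EMA copy of itself. All module and variable names (ConsistencyGenerator, Discriminator, the sigma schedule, lambda_cm) are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConsistencyGenerator(nn.Module):
    """f_theta(x_t, t): maps a noisy sample and its noise level to a clean sample."""
    def __init__(self, dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )
    def forward(self, x_t, t):
        t = t.view(-1, 1).expand(x_t.size(0), 1)
        return self.net(torch.cat([x_t, t], dim=1))

class Discriminator(nn.Module):
    def __init__(self, dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),
        )
    def forward(self, x):
        return self.net(x)

def consistency_loss(f, f_ema, x0, sigmas):
    """Consistency training loss: outputs at two adjacent noise levels on the
    same trajectory should agree; the EMA copy provides the stop-gradient target."""
    n = torch.randint(0, len(sigmas) - 1, (x0.size(0),), device=x0.device)
    s_n, s_np1 = sigmas[n], sigmas[n + 1]
    z = torch.randn_like(x0)
    x_np1 = x0 + s_np1.view(-1, 1) * z            # noisier point on the trajectory
    x_n = x0 + s_n.view(-1, 1) * z                # less noisy point, same noise z
    with torch.no_grad():
        target = f_ema(x_n, s_n)
    return F.mse_loss(f(x_np1, s_np1), target)

def generator_step(f, f_ema, disc, x0, sigmas, lambda_cm=1.0):
    """Combined generator objective: adversarial loss + weighted consistency loss."""
    sigma_max = sigmas[-1].expand(x0.size(0))
    x_T = sigma_max.view(-1, 1) * torch.randn_like(x0)   # pure-noise input
    fake = f(x_T, sigma_max)                             # one-step generation
    adv = F.binary_cross_entropy_with_logits(
        disc(fake), torch.ones(x0.size(0), 1, device=x0.device))
    return adv + lambda_cm * consistency_loss(f, f_ema, x0, sigmas)

# Smoke test on random MNIST-shaped data.
f, f_ema, disc = ConsistencyGenerator(), ConsistencyGenerator(), Discriminator()
f_ema.load_state_dict(f.state_dict())
sigmas = torch.linspace(0.002, 80.0, 18)                 # Karras-style noise levels
x0 = torch.rand(16, 784)
print(generator_step(f, f_ema, disc, x0, sigmas))

Under this reading, a large lambda_cm recovers plain consistency training, while the adversarial term acts as a discriminator-based fine-tuning signal on the one-step samples.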
