

Poster

Bridging the Gap Between f-GANs and Wasserstein GANs

Jiaming Song · Stefano Ermon

Keywords: [ Deep Learning - Generative Models and Autoencoders ] [ Unsupervised and Semi-supervised Learning ] [ Generative Adversarial Networks ] [ Deep Generative Models ]


Abstract:

Variants of generative adversarial networks (GANs) approximately minimize divergences between the model and the data distribution using a discriminator. Wasserstein GANs (WGANs) enjoy superior empirical performance; however, unlike in f-GANs, their discriminator does not provide an estimate of the ratio between model and data densities, which is useful in applications such as inverse reinforcement learning. To overcome this limitation, we propose a new training objective in which we additionally optimize over a set of importance weights on the generated samples. By suitably constraining the feasible set of importance weights, we obtain a family of objectives that includes and generalizes the original f-GAN and WGAN objectives. We show that a natural extension outperforms WGANs while providing density ratios as in f-GANs, and demonstrate empirical success on distribution modeling, density ratio estimation, and image generation.
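For concreteness, the two standard objectives the abstract refers to can be written as below; the final importance-weighted form is only an illustrative reading of the abstract, where the weights r and the feasible set \mathcal{R} are placeholders rather than the paper's exact formulation.

% f-GAN: variational bound with a critic T and the convex conjugate f^* of f
\min_\theta \max_T \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[T(x)] - \mathbb{E}_{x \sim q_\theta}[f^*(T(x))]

% WGAN: Kantorovich-Rubinstein dual with a 1-Lipschitz critic D
\min_\theta \max_{\|D\|_L \le 1} \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[D(x)] - \mathbb{E}_{x \sim q_\theta}[D(x)]

% Importance-weighted critic term suggested by the abstract: generated samples are
% reweighted by r, constrained to a feasible set \mathcal{R}. The constraints defining
% \mathcal{R} and the direction of the optimization over r are left unspecified here.
\mathbb{E}_{x \sim p_{\mathrm{data}}}[D(x)] - \mathbb{E}_{x \sim q_\theta}[r(x)\, D(x)], \qquad r \in \mathcal{R}

Per the abstract, different choices of \mathcal{R} would recover the f-GAN and WGAN objectives as special cases, while the weights r supply the density-ratio information that a plain WGAN critic lacks.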
