Mixed batches and symmetric discriminators for GAN training
Thomas LUCAS · Corentin Tallec · Yann Ollivier · Jakob Verbeek

Fri Jul 13 12:30 AM -- 12:50 AM (PDT) @ A7

Generative adversarial networks (GANs) are powerful generative models based on providing feedback to a generative network via a discriminator network. However, the discriminator usually assesses individual samples. This prevents the discriminator from accessing global distributional statistics of generated samples, and often leads to mode dropping: the generator models only part of the target distribution. We propose to feed the discriminator with mixed batches of true and fake samples, and train it to predict the ratio of true samples in the batch. The latter score does not depend on the order of samples in a batch. Rather than learning this invariance, we introduce a generic permutation-invariant discriminator architecture. This architecture is provably a universal approximator of all symmetric functions. Experimentally, our approach reduces mode collapse in GANs on two synthetic datasets, and obtains good results on the CIFAR10 and CelebA datasets, both qualitatively and quantitatively.
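The key structural idea in the abstract — a discriminator whose output on a batch is invariant to the order of its samples — can be illustrated with the classic construction of a symmetric function: apply a per-sample feature map, pool with a symmetric operation such as the mean, then map the pooled features to a score. The sketch below is a minimal toy illustration of that principle, not the authors' actual architecture; the weight matrices, layer sizes, and function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random weights for a tiny two-stage network:
# a per-sample feature map followed by a pooled ratio head.
W_feat = rng.normal(size=(4, 8))   # per-sample features: R^4 -> R^8
W_head = rng.normal(size=(8, 1))   # pooled features -> ratio logit

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def batch_ratio_score(batch):
    """Score a batch by pooling per-sample features symmetrically.

    Because the mean over samples ignores their order, the output
    is a symmetric (permutation-invariant) function of the batch.
    The sigmoid keeps the predicted true-sample ratio in [0, 1].
    """
    feats = np.tanh(batch @ W_feat)   # feature map applied to each sample
    pooled = feats.mean(axis=0)       # symmetric pooling over the batch
    return float(sigmoid(pooled @ W_head)[0])

batch = rng.normal(size=(16, 4))
shuffled = batch[rng.permutation(16)]

# Mean pooling makes the score independent of sample order:
assert np.isclose(batch_ratio_score(batch), batch_ratio_score(shuffled))
```

Any symmetric pooling (sum, mean, max) yields the invariance; the paper's contribution is a specific architecture of this general kind that is provably a universal approximator of symmetric functions.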

Author Information

Thomas LUCAS (Inria)
Corentin Tallec (INRIA)
Yann Ollivier (Facebook Artificial Intelligence Research)
Jakob Verbeek (INRIA)
