Oral
Is Generator Conditioning Causally Related to GAN Performance?
Augustus Odena · Jacob Buckman · Catherine Olsson · Tom B Brown · Christopher Olah · Colin Raffel · Ian Goodfellow

Thu Jul 12 05:20 AM -- 05:30 AM (PDT) @ A7

Recent work suggests that controlling the entire distribution of Jacobian singular values is an important design consideration in deep learning. Motivated by this, we study the distribution of singular values of the Jacobian of the generator in Generative Adversarial Networks. We find that this Jacobian generally becomes ill-conditioned at the beginning of training. Moreover, we find that the average (across the latent space) conditioning of the generator is highly predictive of two other ad-hoc metrics for measuring the "quality" of trained GANs: the Inception Score and the Fréchet Inception Distance. We then test the hypothesis that this relationship is causal by proposing a "regularization" technique (called Jacobian Clamping) that softly penalizes the condition number of the generator Jacobian. Jacobian Clamping improves the mean score for nearly all datasets on which we tested it. It also greatly reduces inter-run variance of the aforementioned scores, addressing (at least partially) one of the main criticisms of GANs.
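The listing does not include implementation details, but as a rough illustration of what "softly penalizing the condition number of the generator Jacobian" could look like, here is a minimal PyTorch sketch. The perturbation scheme, the function name `jacobian_clamping_penalty`, and the band `[lambda_min, lambda_max]` are illustrative assumptions, not necessarily the authors' exact formulation.

```python
import torch

def jacobian_clamping_penalty(G, z, eps=1.0, lambda_min=1.0, lambda_max=20.0):
    """Sketch of a soft penalty on the generator's local Jacobian conditioning.

    Perturbs each latent vector z by a random direction of norm `eps`,
    estimates the local stretching factor Q = ||G(z') - G(z)|| / ||z' - z||,
    and penalizes Q quadratically whenever it leaves [lambda_min, lambda_max].
    """
    # Random perturbation direction, rescaled to have norm eps per sample.
    delta = torch.randn_like(z)
    delta = eps * delta / delta.norm(dim=1, keepdim=True)
    z_prime = z + delta

    # Generator outputs, flattened to vectors so norms are well defined.
    gz = G(z).flatten(start_dim=1)
    gz_prime = G(z_prime).flatten(start_dim=1)

    # Finite-difference estimate of the Jacobian's stretching along delta.
    q = (gz_prime - gz).norm(dim=1) / delta.norm(dim=1)

    # Zero penalty inside the target band, quadratic penalty outside it.
    over = torch.clamp(q - lambda_max, min=0.0)
    under = torch.clamp(lambda_min - q, min=0.0)
    return (over ** 2 + under ** 2).mean()
```

In use, a term like this would be added to the generator's loss during training (e.g. `g_loss = adversarial_loss + jacobian_clamping_penalty(G, z)`), so that gradients discourage extreme stretching or collapsing of the latent space.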

Author Information

Augustus Odena (Google Brain)
Jacob Buckman (Google)
Catherine Olsson (Google Brain)
Tom B Brown (Google Brain)
Christopher Olah (Google Brain)

I want to understand things clearly and explain them well. Research scientist on [Google Brain](http://g.co/brain), co-editor of [Distill](http://distill.pub).

Colin Raffel (Google)
Ian Goodfellow (Google Brain)
