Oral
Thu Jul 12 05:20 AM -- 05:30 AM (PDT) @ A7
Is Generator Conditioning Causally Related to GAN Performance?
Augustus Odena · Jacob Buckman · Catherine Olsson · Tom B Brown · Christopher Olah · Colin Raffel · Ian Goodfellow
[ PDF ]  [ Video ]

Recent work suggests that controlling the entire distribution of Jacobian singular values is an important design consideration in deep learning. Motivated by this, we study the distribution of singular values of the Jacobian of the generator in Generative Adversarial Networks. We find that this Jacobian generally becomes ill-conditioned at the beginning of training. Moreover, we find that the average (across the latent space) conditioning of the generator is highly predictive of two other ad-hoc metrics for measuring the “quality” of trained GANs: the Inception Score and the Fréchet Inception Distance. We then test the hypothesis that this relationship is causal by proposing a “regularization” technique (called Jacobian Clamping) that softly penalizes the condition number of the generator Jacobian. Jacobian Clamping improves the mean score for nearly all datasets on which we tested it. It also greatly reduces inter-run variance of the aforementioned scores, addressing (at least partially) one of the main criticisms of GANs.
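As a rough illustration of what a soft penalty on the generator Jacobian's conditioning could look like, the sketch below uses a finite-difference estimate: perturb each latent vector, measure how much the generator stretches that perturbation, and quadratically penalize ratios outside a chosen band. The function name, the default values of eps, lambda_min, and lambda_max, and the PyTorch framing are all illustrative assumptions here, not necessarily the paper's exact Jacobian Clamping procedure.

```python
import torch


def jacobian_conditioning_penalty(generator, z, eps=1.0,
                                  lambda_min=1.0, lambda_max=20.0):
    """Hypothetical sketch of a soft penalty on generator conditioning.

    For each latent z, sample a random direction of norm `eps`, compute the
    stretch ratio Q = ||G(z) - G(z')|| / ||z - z'|| (a finite-difference proxy
    for a Jacobian singular value along that direction), and penalize Q
    quadratically whenever it leaves the band [lambda_min, lambda_max].
    """
    # Random perturbation direction with fixed norm eps.
    delta = torch.randn_like(z)
    delta = eps * delta / delta.norm(dim=1, keepdim=True)
    z_prime = z + delta

    g_z = generator(z).flatten(start_dim=1)
    g_zp = generator(z_prime).flatten(start_dim=1)

    # Finite-difference stretch ratio per example.
    q = (g_z - g_zp).norm(dim=1) / delta.norm(dim=1)

    # Zero penalty inside the band, quadratic outside it.
    over = torch.clamp(q - lambda_max, min=0.0) ** 2
    under = torch.clamp(lambda_min - q, min=0.0) ** 2
    return (over + under).mean()
```

In such a setup the penalty would simply be added to the generator's adversarial loss with some weight, e.g. `g_loss = adv_loss + lam * jacobian_conditioning_penalty(G, z)`, so that training only pushes back when the local stretch factor drifts outside the allowed range.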