Poster

Functional Space Analysis of Local GAN Convergence

Valentin Khrulkov · Artem Babenko · Ivan Oseledets

Keywords: Deep Learning Theory

Poster session: Wed 21 Jul 9 a.m. PDT — 11 a.m. PDT
 
Spotlight presentation: Deep Learning Theory 2
Wed 21 Jul 5 a.m. PDT — 6 a.m. PDT

Abstract:

Recent work demonstrated the benefits of studying the continuous-time dynamics governing GAN training. However, these dynamics are typically analyzed in the model parameter space, which results in finite-dimensional dynamical systems. We propose a novel perspective where we study the local dynamics of adversarial training in the general functional space and show how they can be represented as a system of partial differential equations. The convergence properties can thus be inferred from the eigenvalues of the resulting differential operator. We show that these eigenvalues can be efficiently estimated from the target dataset before training. Our perspective reveals several insights into the practical tricks commonly used to stabilize GANs, such as gradient penalty, data augmentation, and advanced integration schemes. As an immediate practical benefit, we demonstrate how one can a priori select an optimal data augmentation strategy for a particular generation task.
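To make the eigenvalue criterion concrete, below is a minimal sketch in the finite-dimensional parameter-space setting the abstract contrasts with: the well-known Dirac-GAN toy problem, where the continuous-time training dynamics are linearized at the equilibrium and local convergence is read off from the Jacobian's eigenvalues. The toy dynamics and the gradient-penalty weight gamma are illustrative assumptions, not the paper's functional-space construction.

    import numpy as np

    def dirac_gan_jacobian(gamma):
        # Jacobian at the equilibrium (theta, psi) = (0, 0) of the
        # continuous-time Dirac-GAN training dynamics:
        #   theta' = -psi                (generator gradient flow)
        #   psi'   = theta - gamma*psi   (discriminator flow plus gradient penalty)
        return np.array([[0.0, -1.0],
                         [1.0, -gamma]])

    for gamma in (0.0, 0.5, 1.0):
        eigvals = np.linalg.eigvals(dirac_gan_jacobian(gamma))
        # Purely imaginary eigenvalues => training cycles around the equilibrium;
        # all real parts negative => local exponential convergence.
        print(f"gamma={gamma}: eigenvalues={eigvals}, max Re={eigvals.real.max():+.3f}")

With gamma = 0 the eigenvalues are purely imaginary and training cycles forever; any gamma > 0 pushes all real parts negative, matching the abstract's point that the gradient penalty stabilizes training. The paper lifts this kind of spectral analysis to the infinite-dimensional functional space, where the eigenvalues of the resulting differential operator can be estimated from the target dataset before training.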
