Poster
SGD Learns One-Layer Networks in WGANs
Qi Lei · Jason Lee · Alexandros Dimakis · Constantinos Daskalakis

Thu Jul 16 08:00 AM -- 08:45 AM & Thu Jul 16 07:00 PM -- 07:45 PM (PDT)

Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful variants of GANs, require solving a min-max optimization problem to global optimality, yet in practice they are trained successfully with stochastic gradient descent-ascent. In this paper, we show that when the generator is a one-layer network, stochastic gradient descent-ascent converges to a global solution with polynomial time and sample complexity.
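To illustrate the kind of dynamics the abstract describes, here is a minimal NumPy sketch of simultaneous stochastic gradient descent-ascent on a toy WGAN-style min-max objective. This is not the paper's exact setting: the generator is a one-layer shift map G(z) = θ + z, the critic is linear with an ℓ2 penalty to keep the max problem bounded, and all names and hyperparameters (mu_star, eta, batch) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_star = 2.0          # mean of the (toy) target distribution
theta, w = 0.0, 0.0    # generator parameter and linear critic weight
eta = 0.05             # shared step size for both players
batch = 64             # minibatch size for the stochastic gradients

for _ in range(3000):
    x = mu_star + rng.standard_normal(batch)   # real samples
    z = rng.standard_normal(batch)             # latent noise
    g = theta + z                              # one-layer (shift) generator G(z)
    # Regularized objective: f(theta, w) = w * (E[x] - E[G(z)]) - w**2 / 2
    grad_w = (x.mean() - g.mean()) - w         # critic takes an ascent step
    grad_theta = -w                            # generator takes a descent step
    w += eta * grad_w
    theta -= eta * grad_theta

# theta drifts toward mu_star, i.e. the generator matches the target mean
```

Under these assumptions the averaged dynamics form a linear system whose eigenvalues lie inside the unit circle for small step sizes, so the iterates spiral into the global saddle point (θ = mu_star, w = 0), loosely mirroring the convergence phenomenon the paper analyzes.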

Author Information

Qi Lei (University of Texas at Austin)
Jason Lee (Princeton)
Alex Dimakis (UT Austin)

Alex Dimakis is an Associate Professor in the Electrical and Computer Engineering Department at the University of Texas at Austin. He received his Ph.D. in electrical engineering and computer sciences from UC Berkeley. He received an ARO Young Investigator Award in 2014, the NSF CAREER Award in 2011, a Google Faculty Research Award in 2012, and the Eli Jury dissertation award in 2008. He is a co-recipient of several best paper awards, including the joint Information Theory and Communications Society Best Paper Award in 2012. His research interests include information theory, coding theory, and machine learning.

Constantinos Daskalakis (MIT)
