Score-Guided Intermediate Layer Optimization: Fast Langevin Mixing for Inverse Problems
Giannis Daras · Yuval Dagan · Alexandros Dimakis · Constantinos Daskalakis

Wed Jul 20 08:55 AM -- 09:00 AM (PDT) @ Ballroom 1 & 2

We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators. We extend the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. Our theoretical analysis reveals a relationship between the latent dimension and the stationary distribution to which the Langevin Algorithm mixes. Motivated by our findings, we train a score-based model in the low-dimensional latent space of a (possibly random) generator and use it to solve inverse problems. Our Score-Guided Intermediate Layer Optimization (SGILO) framework extends prior work by replacing the sparsity regularization with a generative prior in the intermediate layer. Experimentally, we obtain significant improvements over the previous state-of-the-art, especially when measurements are scarce. SGILO also requires far fewer steps and is considerably faster.
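The core idea in the abstract, running Langevin dynamics over a low-dimensional latent space to sample from the posterior of an inverse problem, can be illustrated with a minimal toy sketch. The linear "generator" `G(z) = W z`, the measurement operator `A`, the Gaussian latent prior, and all dimensions below are illustrative stand-ins, not the paper's actual models or score network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: linear "generator" G(z) = W z, measurements y = A G(z*) + noise,
# and a standard-Gaussian latent prior, whose score is simply -z.
k, n, m = 8, 32, 16                       # latent dim, signal dim, measurement dim
W = rng.normal(size=(n, k)) / np.sqrt(k)  # generator weights
A = rng.normal(size=(m, n)) / np.sqrt(n)  # measurement operator
z_true = rng.normal(size=k)
sigma = 0.05                              # measurement noise level
y = A @ W @ z_true + sigma * rng.normal(size=m)

def posterior_score(z):
    """Score of p(z | y): latent prior score plus data-fidelity gradient."""
    prior = -z                                        # grad log N(0, I)
    residual = y - A @ W @ z
    likelihood = (W.T @ A.T @ residual) / sigma**2    # grad log p(y | z)
    return prior + likelihood

# Unadjusted Langevin iteration:
#   z <- z + (eta / 2) * score(z) + sqrt(eta) * xi,   xi ~ N(0, I)
eta = 1e-4
z = rng.normal(size=k)
for _ in range(5000):
    z = z + 0.5 * eta * posterior_score(z) + np.sqrt(eta) * rng.normal(size=k)

# After mixing, z is approximately a posterior sample, so the
# reconstruction G(z) should be close to G(z_true).
print(np.linalg.norm(W @ z - W @ z_true))
```

In the actual SGILO setting the hand-written Gaussian prior score would be replaced by a learned score-based model over an intermediate layer of the generator, and `G` is a nonlinear DNN rather than a matrix.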

Author Information

Giannis Daras (The University of Texas at Austin)
Yuval Dagan (MIT)
Alexandros Dimakis (UT Austin)

Alex Dimakis is an Associate Professor in the Electrical and Computer Engineering department at the University of Texas at Austin. He received his Ph.D. in electrical engineering and computer sciences from UC Berkeley. He received an ARO Young Investigator Award in 2014, the NSF CAREER award in 2011, a Google faculty research award in 2012, and the Eli Jury dissertation award in 2008. He is the co-recipient of several best paper awards, including the joint Information Theory and Communications Society Best Paper Award in 2012. His research interests include information theory, coding theory, and machine learning.

Constantinos Daskalakis (MIT)
