Score-Guided Intermediate Layer Optimization: Fast Langevin Mixing for Inverse Problems

Giannis Daras · Yuval Dagan · Alexandros Dimakis · Constantinos Daskalakis

Hall E #207

Keywords: [ DL: Generative Models and Autoencoders ] [ OPT: Sampling and Optimization ] [ PM: Monte Carlo and Sampling Methods ] [ DL: Theory ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
Spotlight presentation: Deep Learning
Wed 20 Jul 7:30 a.m. PDT — 9 a.m. PDT


We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators. This result extends the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. In practice, to allow for increased expressivity, we propose to do posterior sampling in the latent space of a pre-trained generative model. To achieve that, we train a score-based model in the latent space of a StyleGAN-2 and use it to solve inverse problems. Our framework, Score-Guided Intermediate Layer Optimization (SGILO), extends prior work by replacing the sparsity regularization with a generative prior in the intermediate layer. Experimentally, we obtain significant improvements over the previous state-of-the-art, especially in the low-measurement regime.
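The score-guided Langevin posterior sampling described above can be sketched in a few lines. The following is a minimal toy illustration (not the authors' code): the learned score network is replaced by the score of a standard Gaussian prior, and the inverse problem is a linear measurement model `y = A z + noise`; all names and hyperparameters here are hypothetical.

```python
import numpy as np

# Toy sketch of Langevin posterior sampling in a latent space.
# We assume a score model s(z) ~ grad_z log p(z); here the stand-in
# "prior" is a standard Gaussian, whose exact score is -z.
rng = np.random.default_rng(0)
d, m = 8, 4                       # latent dimension, number of measurements
A = rng.standard_normal((m, d))   # linear measurement operator (hypothetical)
z_true = rng.standard_normal(d)
sigma = 0.1                       # measurement noise level
y = A @ z_true + sigma * rng.standard_normal(m)

def prior_score(z):
    # Stand-in for a learned score network over the generator's latent space.
    return -z

def langevin_posterior_sample(y, steps=5000, eta=5e-5):
    """Unadjusted Langevin dynamics on the posterior log-density:
    log p(z | y) = log p(y | z) + log p(z) + const."""
    z = rng.standard_normal(d)
    for _ in range(steps):
        grad = prior_score(z) + A.T @ (y - A @ z) / sigma**2
        z = z + eta * grad + np.sqrt(2 * eta) * rng.standard_normal(d)
    return z

z_hat = langevin_posterior_sample(y)
print(np.linalg.norm(A @ z_hat - y))  # measurement residual stays small
```

In SGILO the Gaussian score above would be replaced by the score model trained in the intermediate latent space of StyleGAN-2, and `A` by the composition of the generator's remaining layers with the forward operator.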
