
Talk
Compressed Sensing using Generative Models
Ashish Bora · Ajil Jalal · Eric Price · Alexandros Dimakis

Mon Aug 07 09:42 PM -- 10:00 PM (PDT) @ C4.4
The goal of compressed sensing is to estimate a vector from an underdetermined system of noisy linear measurements, by making use of prior knowledge on the structure of vectors in the relevant domain. For almost all results in this literature, the structure is represented by sparsity in a well-chosen basis. We show how to achieve guarantees similar to standard compressed sensing but without employing sparsity at all. Instead, we suppose that vectors lie near the range of a generative model $G: \mathbb{R}^k \to \mathbb{R}^n$. Our main theorem is that, if $G$ is $L$-Lipschitz, then roughly $\mathcal{O}(k \log L)$ random Gaussian measurements suffice for an $\ell_2/\ell_2$ recovery guarantee. We demonstrate our results using generative models from published variational autoencoders and generative adversarial networks. Our method can use $5$-$10$x fewer measurements than Lasso for the same accuracy.
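The recovery idea behind the abstract can be sketched as follows: given measurements $y = Ax$ of a signal $x$ near the range of a generator $G$, search the latent space for a $z$ minimizing $\|A G(z) - y\|_2^2$. The sketch below is a minimal toy illustration, not the authors' code: the generator is a stand-in random linear map, whereas in the paper $G$ is a trained VAE decoder or GAN generator and the objective is minimized with automatic differentiation.

```python
import numpy as np

# Toy sketch of recovery via latent-space optimization (hypothetical setup):
# given Gaussian measurements y = A x of a signal in the range of G, find z
# minimizing ||A G(z) - y||^2 by gradient descent.

rng = np.random.default_rng(0)
n, k, m = 100, 5, 30            # ambient dim, latent dim, measurements (m << n)

W = rng.standard_normal((n, k))
G = lambda z: W @ z             # stand-in linear generator G: R^k -> R^n

x_true = G(rng.standard_normal(k))             # signal in the range of G
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x_true                                 # noiseless measurements

# Gradient descent on f(z) = ||A G(z) - y||^2.  For this linear toy case the
# step size 1/(2 ||A W||^2) guarantees convergence; a trained network G would
# make the objective non-convex, and the paper optimizes it with autodiff.
M = A @ W
lr = 1.0 / (2 * np.linalg.norm(M, ord=2) ** 2)
z = np.zeros(k)
for _ in range(500):
    z -= lr * 2 * M.T @ (M @ z - y)            # gradient of f at z

rel_err = np.linalg.norm(G(z) - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {rel_err:.2e}")
```

Note that with $m = 30 \ll n = 100$ the linear system $Ax = y$ alone is badly underdetermined; recovery succeeds here because the search is restricted to the $k$-dimensional range of $G$, mirroring the role sparsity plays in classical compressed sensing.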

#### Author Information

##### Ashish Bora (University of Texas at Austin)

I am interested in building theory and tools to understand and apply machine learning. Currently, I am a second-year graduate student in the Computer Science Department at the University of Texas at Austin. Prior to that, I completed my undergraduate degree in Electrical Engineering (Hons.) with a minor in Computer Science at the Indian Institute of Technology Bombay.

##### Alexandros Dimakis (UT Austin)

Alex Dimakis is an Associate Professor in the Electrical and Computer Engineering department at the University of Texas at Austin. He received his Ph.D. in electrical engineering and computer sciences from UC Berkeley. He received an ARO Young Investigator Award in 2014, the NSF CAREER Award in 2011, a Google Faculty Research Award in 2012, and the Eli Jury Dissertation Award in 2008. He is the co-recipient of several best paper awards, including the joint Information Theory and Communications Society Best Paper Award in 2012. His research interests include information theory, coding theory, and machine learning.