Poster

Robust One-Bit Recovery via ReLU Generative Networks: Near-Optimal Statistical Rate and Global Landscape Analysis

Shuang Qiu · Xiaohan Wei · Zhuoran Yang

Keywords: [ Non-convex Optimization ] [ Optimization ] [ Optimization - Non-convex ]


Abstract: We study the robust one-bit compressed sensing problem, whose goal is to design an algorithm that faithfully recovers any sparse target vector $\theta_0 \in \mathbb{R}^d$ \textit{uniformly} from $m$ quantized noisy measurements. Specifically, we consider a new framework for this problem in which sparsity is implicitly enforced by mapping a low-dimensional representation $x_0 \in \mathbb{R}^k$ through a known $n$-layer ReLU generative network $G: \mathbb{R}^k \to \mathbb{R}^d$ such that $\theta_0 = G(x_0)$. Such a framework places low-dimensional priors on $\theta_0$ without a known sparsity basis. We propose to recover the target $G(x_0)$ by solving an unconstrained empirical risk minimization (ERM) problem. Under a weak \textit{sub-exponential measurement assumption}, we establish a joint statistical and computational analysis. In particular, we prove that the ERM estimator in this new framework achieves a statistical rate of $m = \tilde{O}(kn \log d / \varepsilon^2)$, recovering any $G(x_0)$ uniformly up to an error $\varepsilon$. When the network is shallow (i.e., $n$ is small), we show this rate matches the information-theoretic lower bound up to logarithmic factors in $\varepsilon^{-1}$. From the lens of computation, we prove that, under proper conditions on the network weights, our proposed empirical risk, despite non-convexity, has no stationary point outside of small neighborhoods around the true representation $x_0$ and its negative multiple; furthermore, under additional assumptions on the weights, we show that the global minimizer of the empirical risk stays within the neighborhood around $x_0$ rather than its negative multiple.
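The setup above can be illustrated with a small numerical sketch: a random two-layer ReLU network $G$, one-bit measurements $y_i = \mathrm{sign}(\langle a_i, G(x_0)\rangle + \text{noise})$, and gradient descent on an unconstrained empirical risk over the latent variable $x$. All dimensions, the Gaussian measurement model, the specific squared-type surrogate loss, and the finite-difference optimizer below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: latent dim k, ambient dim d, m measurements, n = 2 layers.
k, d, m = 4, 20, 500
W1 = rng.normal(size=(10, k)) / np.sqrt(k)   # layer-1 weights (assumed random)
W2 = rng.normal(size=(d, 10)) / np.sqrt(10)  # layer-2 weights (assumed random)

def G(x):
    """Two-layer ReLU generative network G: R^k -> R^d."""
    return W2 @ np.maximum(W1 @ x, 0.0)

# Ground-truth representation x0 and target vector theta0 = G(x0).
x0 = rng.normal(size=k)
theta0 = G(x0)

# One-bit quantized noisy measurements with Gaussian measurement vectors.
A = rng.normal(size=(m, d))
y = np.sign(A @ theta0 + 0.1 * rng.normal(size=m))

def empirical_risk(x):
    """Illustrative ERM surrogate (a stand-in, not the paper's exact loss):
    L(x) = ||G(x)||^2 / 2 - (1/m) * sum_i y_i <a_i, G(x)>."""
    g = G(x)
    return 0.5 * (g @ g) - (y @ (A @ g)) / m

def num_grad(f, x, eps=1e-5):
    """Central finite-difference gradient (simple, illustration only)."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        grad[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return grad

# Plain gradient descent from a random initialization.
x_init = rng.normal(size=k)
x = x_init.copy()
for _ in range(600):
    x -= 0.01 * num_grad(empirical_risk, x)

# Cosine similarity between the recovered G(x) and the target theta0.
corr = (G(x) @ theta0) / (np.linalg.norm(G(x)) * np.linalg.norm(theta0) + 1e-12)
```

The landscape result in the abstract suggests why plain gradient descent is a sensible choice here: stationary points concentrate near $x_0$ and a negative multiple of it, so a descent run typically ends near one of those two neighborhoods rather than at an arbitrary spurious point.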
