Session
Sparsity and Compressed Sensing 2
Linear Spectral Estimators and an Application to Phase Retrieval
Ramina Ghods · Andrew Lan · Tom Goldstein · Christoph Studer
Phase retrieval refers to the problem of recovering real- or complex-valued vectors from magnitude measurements. The best-known algorithms for this problem are iterative in nature and rely on so-called spectral initializers that provide accurate initialization vectors. We propose a novel class of estimators suitable for general nonlinear measurement systems, called linear spectral estimators (LSPEs), which can be used to compute accurate initialization vectors for phase retrieval problems. The proposed LSPEs not only provide accurate initialization vectors for noisy phase retrieval systems with structured or random measurement matrices, but also enable the derivation of sharp and nonasymptotic mean-squared error bounds. We demonstrate the efficacy of LSPEs on synthetic and real-world phase retrieval problems, and we show that our estimators significantly outperform existing methods for structured measurement systems that arise in practice.
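To make the notion of a spectral initializer concrete, the sketch below implements the classical spectral initialization for phase retrieval (not the paper's LSPE): given magnitude measurements $y_i = |a_i^T x|^2$, form the weighted sample covariance $D = \frac{1}{m}\sum_i y_i a_i a_i^T$ and take its leading eigenvector as the initialization vector. All names and the toy Gaussian setup are illustrative assumptions.

```python
import numpy as np

def spectral_init(A, y):
    """Classical spectral initializer (illustrative, not the paper's LSPE).

    A: (m, n) measurement matrix with rows a_i; y: (m,) squared magnitudes.
    Returns the leading eigenvector of D = (1/m) * sum_i y_i a_i a_i^T.
    """
    m, _ = A.shape
    D = (A * y[:, None]).T @ A / m          # D = (1/m) A^T diag(y) A
    _, eigvecs = np.linalg.eigh(D)          # eigh: ascending eigenvalues
    return eigvecs[:, -1]                   # leading eigenvector, unit norm

# Toy example: Gaussian measurements of a planted unit-norm signal.
rng = np.random.default_rng(0)
n, m = 20, 2000
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
A = rng.standard_normal((m, n))
y = (A @ x) ** 2
x0 = spectral_init(A, y)
# Up to a global sign, x0 should correlate strongly with x.
corr = abs(x0 @ x)
```

For Gaussian rows, $E[D] = I + 2xx^T$, so the spectral gap at the top eigenvalue makes the leading eigenvector align with $x$ once $m/n$ is moderately large.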
Covariate Adjusted Precision Matrix Estimation via Nonconvex Optimization
Jinghui Chen · Pan Xu · Lingxiao Wang · Jian Ma · Quanquan Gu
We propose a nonconvex estimator for the covariate adjusted precision matrix estimation problem in the high dimensional regime, under sparsity constraints. To solve this estimator, we propose an alternating gradient descent algorithm with hard thresholding. Compared with existing methods along this line of research, which lack theoretical guarantees on the optimization error and/or the statistical error, the proposed algorithm is not only computationally much more efficient, with a linear rate of convergence, but also attains the optimal statistical rate up to a logarithmic factor. Thorough experiments on both synthetic and real data support our theory.
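The hard-thresholding ingredient can be illustrated on a much simpler problem than the paper's: the sketch below runs gradient descent with hard thresholding (iterative hard thresholding) on plain sparse linear regression. This is only a sketch of the thresholding mechanism, not the paper's covariate adjusted precision matrix estimator; all names and the toy setup are assumptions.

```python
import numpy as np

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v and zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def iht(X, y, k, step, iters=300):
    """Gradient descent on (1/2n)||X beta - y||^2 with hard thresholding."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ beta - y) / len(y)
        beta = hard_threshold(beta - step * grad, k)
    return beta

# Toy noiseless example with a 3-sparse ground truth.
rng = np.random.default_rng(1)
n, d, k = 200, 50, 3
beta_true = np.zeros(d)
beta_true[:k] = [2.0, -1.5, 1.0]
X = rng.standard_normal((n, d))
y = X @ beta_true
beta_hat = iht(X, y, k, step=0.5)
```

Under restricted-eigenvalue-type conditions, each iteration contracts the error at a geometric rate, which is the "linear rate of convergence" flavor the abstract refers to.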
Signal and Noise Statistics Oblivious Orthogonal Matching Pursuit
Sreejith Kallummil · Sheetal Kalyani
Orthogonal matching pursuit (OMP) is a widely used algorithm for recovering sparse high dimensional vectors in linear regression models. The optimal performance of OMP requires a priori knowledge of either the sparsity of the regression vector or the noise statistics. Both quantities are rarely known a priori and are very difficult to estimate. In this paper, we present a novel technique called residual ratio thresholding (RRT) to operate OMP without any a priori knowledge of sparsity and noise statistics, and we establish finite sample and large sample support recovery guarantees for it. Both analytical results and numerical simulations on real and synthetic data sets indicate that RRT has a performance comparable to OMP with a priori knowledge of sparsity and noise statistics.
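The sketch below implements textbook OMP and also records the residual-ratio sequence $\|r_k\| / \|r_{k-1}\|$ that RRT thresholds to decide when to stop; the actual RRT threshold is omitted and the stopping logic simplified (see the paper). Names and the toy setup are illustrative assumptions.

```python
import numpy as np

def omp(X, y, k_max):
    """Greedy OMP: returns the estimate and the residual norms per step."""
    _, d = X.shape
    support, residual = [], y.copy()
    res_norms = [np.linalg.norm(y)]
    for _ in range(k_max):
        j = int(np.argmax(np.abs(X.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
        res_norms.append(np.linalg.norm(residual))
    beta = np.zeros(d)
    beta[support] = coef
    return beta, np.array(res_norms)

# Toy noiseless example with a 2-sparse ground truth.
rng = np.random.default_rng(2)
n, d = 100, 30
beta_true = np.zeros(d)
beta_true[[3, 7]] = [1.0, -2.0]
X = rng.standard_normal((n, d))
y = X @ beta_true
beta_hat, res_norms = omp(X, y, k_max=2)
ratios = res_norms[1:] / res_norms[:-1]   # residual ratios used by RRT
```

The residual ratios drop sharply once the true support has been captured, which is the signal RRT exploits to stop without knowing the sparsity or noise level.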
Testing Sparsity over Known and Unknown Bases
Siddharth Barman · Arnab Bhattacharyya · Suprovat Ghoshal
Sparsity is a basic property of real vectors that is exploited in a wide variety of machine learning applications. In this work, we describe property testing algorithms for sparsity that observe a low-dimensional projection of the input. We consider two settings. In the first setting, we test sparsity with respect to an unknown basis: given input vectors $y_1, \ldots, y_p \in R^d$ whose concatenation as columns forms $Y \in R^{d \times p}$, does $Y = AX$ for matrices $A \in R^{d \times m}$ and $X \in R^{m \times p}$ such that each column of $X$ is $k$-sparse, or is $Y$ "far" from having such a decomposition? In the second setting, we test sparsity with respect to a known basis: for a fixed design matrix $A \in R^{d \times m}$, given an input vector $y \in R^d$, is $y = Ax$ for some $k$-sparse vector $x$, or is $y$ "far" from having such a decomposition? We analyze our algorithms using tools from high-dimensional geometry and probability.
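For intuition on the known-basis setting, the sketch below checks distance to $k$-sparsity in the special case where $A$ is orthonormal (the paper's testers are more refined and observe only a low-dimensional projection of the input). With $A$ orthonormal, $x = A^T y$, and the distance from $y$ to $\{Az : z \text{ is } k\text{-sparse}\}$ is the norm of the coefficients outside the $k$ largest magnitudes. Names and the toy setup are assumptions.

```python
import numpy as np

def dist_to_k_sparse(A, y, k):
    """Distance from y to the k-sparse vectors in an orthonormal basis A."""
    x = A.T @ y                         # coefficients of y in the basis A
    tail = np.sort(np.abs(x))[:-k]      # all but the k largest magnitudes
    return np.linalg.norm(tail)

# Toy example: a vector that is exactly 2-sparse in a random
# orthonormal basis, versus a generic perturbation of it.
rng = np.random.default_rng(3)
d, k = 16, 2
A, _ = np.linalg.qr(rng.standard_normal((d, d)))  # orthonormal columns
x = np.zeros(d)
x[[1, 4]] = [3.0, -1.0]
y_sparse = A @ x                                   # exactly 2-sparse in A
y_far = y_sparse + A @ rng.standard_normal(d)      # generic perturbation
```

A property tester must make the same sparse/far distinction while reading only a low-dimensional projection of $y$, which is where the high-dimensional geometry in the paper comes in.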