Global Convergence of Adaptive Sensing for Principal Eigenvector Estimation
Alex Saad-Falcon ⋅ Brighton Ancelin ⋅ Justin Romberg
Abstract
We analyze a compressed variant of Oja's algorithm for estimating the principal eigenvector of the data covariance matrix using only two adaptive measurements per sample. At each iteration, we observe one measurement along the current estimate and one along a random orthogonal direction. We prove that after $t$ iterations, the expected sine-squared error between the estimate and the true eigenvector is $\mathcal{O}(\lambda_1\lambda_2 d^2 / (\Delta^2 t))$, where $d$ is the ambient dimension, $\lambda_1, \lambda_2$ are the leading eigenvalues, and $\Delta = \lambda_1 - \lambda_2$ is the eigengap. We complement this with a matching information-theoretic lower bound of $\Omega(\lambda_1\lambda_2 d^2 / (\Delta^2 t))$, the first for compressed eigenvector estimation, proving that the $d^2$ factor (an additional factor of $d$ over full-observation PCA) is the fundamental cost of compression and cannot be improved. Our analysis handles the noisy setting where the covariance has nonzero trailing eigenvalues, providing the first convergence guarantee for adaptive compressed subspace tracking beyond the noiseless case.
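The measurement model described above can be illustrated with a small numerical sketch. The update rule below is a plausible Oja-style reconstruction from the two scalar measurements, not the paper's exact algorithm, and the step-size schedule and covariance spectrum are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic covariance: lambda_1 = 4, lambda_2 = 1 (eigengap Delta = 3),
# with nonzero trailing eigenvalues (the "noisy" setting in the abstract).
d = 10
eigvals = np.array([4.0, 1.0] + [0.05] * (d - 2))
cov_sqrt = np.diag(np.sqrt(eigvals))   # Sigma^{1/2}; top eigenvector is e_1

w = rng.standard_normal(d)             # initial estimate
w /= np.linalg.norm(w)

T = 20000
for t in range(1, T + 1):
    x = cov_sqrt @ rng.standard_normal(d)   # fresh sample, never observed in full
    # Random unit direction orthogonal to the current estimate.
    v = rng.standard_normal(d)
    v -= (v @ w) * w
    v /= np.linalg.norm(v)
    # Two adaptive measurements per sample.
    y1 = w @ x                              # along the current estimate
    y2 = v @ x                              # along the random orthogonal direction
    # Oja-style update assembled from the two measurements
    # (hypothetical step-size schedule; the paper's rule may differ).
    eta = 1.0 / (0.5 * t + 50.0)
    w = w + eta * y1 * (y1 * w + y2 * v)
    w /= np.linalg.norm(w)

sin2 = 1.0 - w[0] ** 2                      # sin^2 angle to the true eigenvector e_1
print(f"sin^2 error after {T} iterations: {sin2:.4f}")
```

Note that only the two scalars $y_1, y_2$ (never the full sample $x$) enter the update, which is the source of the extra factor of $d$ in the error bound.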