

Poster

Compressed Factorization: Fast and Accurate Low-Rank Factorization of Compressively-Sensed Data

Vatsal Sharan · Kai Sheng Tai · Peter Bailis · Gregory Valiant

Pacific Ballroom #187

Keywords: [ Matrix Factorization ] [ Sparsity and Compressed Sensing ]


Abstract:

What learning algorithms can be run directly on compressively-sensed data? In this work, we consider the question of accurately and efficiently computing low-rank matrix or tensor factorizations given data compressed via random projections. We examine the approach of first performing factorization in the compressed domain, and then reconstructing the original high-dimensional factors from the recovered (compressed) factors. In both the matrix and tensor settings, we establish conditions under which this natural approach will provably recover the original factors. While it is well-known that random projections preserve a number of geometric properties of a dataset, our work can be viewed as showing that they can also preserve certain solutions of non-convex, NP-hard problems like non-negative matrix factorization. We support these theoretical results with experiments on synthetic data and demonstrate the practical applicability of compressed factorization on real-world gene expression and EEG time series datasets.
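
The two-step pipeline described in the abstract can be sketched concretely. The snippet below is a minimal toy illustration of the tensor case, not the paper's implementation: it assumes TensorLy's parafac for the compressed-domain CP decomposition and scikit-learn's Lasso for the sparse-recovery step, and all dimensions, sparsity levels, and the regularization strength are made up for the example. It compresses a low-rank tensor along one mode with a random Gaussian projection, factorizes the small compressed tensor, and then reconstructs the sparse high-dimensional factor column by column.

```python
# Minimal sketch of "factorize in the compressed domain, then recover the factors".
# Assumed tools (not from the paper): tensorly.parafac and sklearn Lasso.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Ground truth: a rank-k tensor whose mode-0 factor U is sparse and high-dimensional.
d, n1, n2, k, m = 1000, 40, 40, 4, 150            # illustrative sizes only
U = rng.standard_normal((d, k)) * (rng.random((d, k)) < 0.02)   # ~2% nonzeros per column
V = rng.standard_normal((n1, k))
Z = rng.standard_normal((n2, k))
T = np.einsum('ir,jr,kr->ijk', U, V, Z)           # d x n1 x n2 data tensor

# Compressive sensing along mode 0: project each mode-0 fiber from d down to m dimensions.
A = rng.standard_normal((m, d)) / np.sqrt(m)
Tc = np.einsum('mi,ijk->mjk', A, T)               # m x n1 x n2 compressed tensor

# Step 1: factorization in the compressed domain (CP decomposition of the small tensor).
weights, factors = parafac(tl.tensor(Tc), rank=k, n_iter_max=500, tol=1e-10)
Uc = factors[0] * weights                         # compressed mode-0 factor, m x k

# Step 2: reconstruct the high-dimensional factor column by column via sparse recovery.
U_hat = np.zeros((d, k))
for r in range(k):
    lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000)
    lasso.fit(A, Uc[:, r])
    U_hat[:, r] = lasso.coef_

# U_hat matches U up to column permutation and scaling when the factors are sparse and
# identifiable, which is the regime the paper's guarantees address.
print("compressed tensor:", Tc.shape, "recovered factor:", U_hat.shape)
```

The tensor case is shown here because CP factors are identifiable up to permutation and scaling, so the compressed-domain factorization has a well-defined target; the matrix/NMF case in the paper follows the same two steps, with the compressed-domain factorization and the recovery conditions adapted accordingly.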
