
An Efficient, Sparsity-Preserving, Online Algorithm for Low-Rank Approximation
David Anderson · Ming Gu

Tue Aug 08 01:30 AM -- 05:00 AM (PDT) @ Gallery #25

Low-rank matrix approximation is a fundamental tool in data analysis for processing large datasets, reducing noise, and finding important signals. In this work, we present a novel truncated LU factorization called Spectrum-Revealing LU (SRLU) for effective low-rank matrix approximation, and develop a fast algorithm to compute an SRLU factorization. We provide both matrix and singular value approximation error bounds for the SRLU approximation computed by our algorithm. Our analysis suggests that SRLU is competitive with the best low-rank matrix approximation methods, deterministic or randomized, in both computational complexity and approximation quality. Numerical experiments illustrate that SRLU preserves sparsity, highlights important data features and variables, can be efficiently updated, and calculates data approximations nearly as accurately as the best possible. To the best of our knowledge, this is the first practical variant of the LU factorization for effective and efficient low-rank matrix approximation.
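The abstract's core idea is that a truncated LU factorization can serve as a cheap low-rank approximation. The sketch below illustrates this general idea only: it truncates an ordinary partial-pivoted LU (via SciPy) to rank k and compares it against the truncated SVD, which is the best possible rank-k approximation. It does not implement the paper's SRLU pivoting strategy or its error bounds; the matrix and rank are illustrative choices.

```python
import numpy as np
from scipy.linalg import lu

# Build a test matrix with rapidly decaying singular values,
# so a low-rank approximation is meaningful. (Illustrative setup,
# not from the paper.)
rng = np.random.default_rng(0)
n, k = 200, 10
U0 = rng.standard_normal((n, n))
V0 = rng.standard_normal((n, n))
s = 2.0 ** -np.arange(n)            # geometric spectral decay
A = (U0 * s) @ V0

# Generic truncated LU with partial pivoting (NOT the SRLU pivoting
# of the paper): keep the first k columns of L and first k rows of U.
P, L, Uf = lu(A)                    # A = P @ L @ Uf
A_lu = P @ (L[:, :k] @ Uf[:k, :])   # rank-k LU-based approximation

# Baseline: truncated SVD, optimal in spectral norm (Eckart-Young).
Us, ss, Vt = np.linalg.svd(A)
A_svd = (Us[:, :k] * ss[:k]) @ Vt[:k, :]

err_lu = np.linalg.norm(A - A_lu, 2)
err_svd = np.linalg.norm(A - A_svd, 2)   # equals ss[k]
print(f"rank-{k} LU error: {err_lu:.3e}, optimal (SVD) error: {err_svd:.3e}")
```

On matrices with fast spectral decay, the LU-based error is typically within a modest factor of the optimal SVD error while being cheaper to compute and sparsity-friendly; the paper's contribution is a pivoting scheme with provable bounds of this kind.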

Author Information

David Anderson (UC Berkeley)
Ming Gu (University of California at Berkeley)
