Non-negative matrix factorization is a basic tool for decomposing data into feature and weight matrices under non-negativity constraints, and in practice it is often solved in the alternating minimization framework. However, it is unclear whether such algorithms can recover the ground-truth feature matrix when the weights for different features are highly correlated, which is common in applications. This paper proposes a simple and natural algorithm based on alternating gradient descent, and shows that with a mild initialization it provably recovers the ground truth in the presence of strong correlations. In most interesting cases, the correlations can be of the same order as the highest possible. Our analysis also reveals several favorable features of the algorithm, including robustness to noise. We complement our theoretical results with empirical studies on semi-synthetic datasets, demonstrating the algorithm's advantage over several popular methods in recovering the ground truth.
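To make the setup concrete, here is a minimal sketch of alternating projected gradient descent for NMF: given data Y ≈ AW with A, W ≥ 0, it alternates gradient steps on the feature matrix A and the weight matrix W, projecting each back to the non-negative orthant. This is a generic illustration of the framework under an assumed squared loss, with hypothetical step size and iteration count; it is not the exact update rule or initialization analyzed in the paper.

```python
import numpy as np

def nmf_alt_grad(Y, k, steps=500, lr=1e-3, seed=0):
    """Generic alternating projected gradient descent for Y ~= A @ W, A, W >= 0."""
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    A = np.abs(rng.standard_normal((n, k)))   # feature matrix (n x k)
    W = np.abs(rng.standard_normal((k, m)))   # weight matrix  (k x m)
    for _ in range(steps):
        R = A @ W - Y                          # residual
        A = np.maximum(A - lr * R @ W.T, 0.0)  # gradient step on A, project to >= 0
        R = A @ W - Y
        W = np.maximum(W - lr * A.T @ R, 0.0)  # gradient step on W, project to >= 0
    return A, W

# Usage sketch: for a small enough step size, 0.5 * ||Y - A @ W||_F**2 decreases.
Y = np.abs(np.random.default_rng(1).standard_normal((50, 80)))
A, W = nmf_alt_grad(Y, k=5)
```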
Author Information
Yuanzhi Li (Princeton University)
Yingyu Liang (Princeton University)
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Poster: Provable Alternating Gradient Descent for Non-negative Matrix Factorization with Strong Correlations
  Mon. Aug 7th 08:30 AM -- 12:00 PM, Room: Gallery #100
More from the Same Authors
- 2018 Poster: Make the Minority Great Again: First-Order Regret Bound for Contextual Bandits
  Zeyuan Allen-Zhu · Sebastien Bubeck · Yuanzhi Li
- 2018 Oral: Make the Minority Great Again: First-Order Regret Bound for Contextual Bandits
  Zeyuan Allen-Zhu · Sebastien Bubeck · Yuanzhi Li
- 2018 Poster: An Alternative View: When Does SGD Escape Local Minima?
  Bobby Kleinberg · Yuanzhi Li · Yang Yuan
- 2018 Poster: The Well-Tempered Lasso
  Yuanzhi Li · Yoram Singer
- 2018 Oral: The Well-Tempered Lasso
  Yuanzhi Li · Yoram Singer
- 2018 Oral: An Alternative View: When Does SGD Escape Local Minima?
  Bobby Kleinberg · Yuanzhi Li · Yang Yuan
- 2017 Poster: Near-Optimal Design of Experiments via Regret Minimization
  Zeyuan Allen-Zhu · Yuanzhi Li · Aarti Singh · Yining Wang
- 2017 Poster: Differentially Private Clustering in High-Dimensional Euclidean Spaces
  Nina Balcan · Travis Dick · Yingyu Liang · Wenlong Mou · Hongyang Zhang
- 2017 Talk: Near-Optimal Design of Experiments via Regret Minimization
  Zeyuan Allen-Zhu · Yuanzhi Li · Aarti Singh · Yining Wang
- 2017 Talk: Differentially Private Clustering in High-Dimensional Euclidean Spaces
  Nina Balcan · Travis Dick · Yingyu Liang · Wenlong Mou · Hongyang Zhang
- 2017 Poster: Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition
  Zeyuan Allen-Zhu · Yuanzhi Li
- 2017 Poster: Faster Principal Component Regression and Stable Matrix Chebyshev Approximation
  Zeyuan Allen-Zhu · Yuanzhi Li
- 2017 Talk: Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition
  Zeyuan Allen-Zhu · Yuanzhi Li
- 2017 Talk: Faster Principal Component Regression and Stable Matrix Chebyshev Approximation
  Zeyuan Allen-Zhu · Yuanzhi Li
- 2017 Poster: Follow the Compressed Leader: Faster Online Learning of Eigenvectors and Faster MMWU
  Zeyuan Allen-Zhu · Yuanzhi Li
- 2017 Poster: Generalization and Equilibrium in Generative Adversarial Nets (GANs)
  Sanjeev Arora · Rong Ge · Yingyu Liang · Tengyu Ma · Yi Zhang
- 2017 Talk: Follow the Compressed Leader: Faster Online Learning of Eigenvectors and Faster MMWU
  Zeyuan Allen-Zhu · Yuanzhi Li
- 2017 Talk: Generalization and Equilibrium in Generative Adversarial Nets (GANs)
  Sanjeev Arora · Rong Ge · Yingyu Liang · Tengyu Ma · Yi Zhang