Nearly Optimal Robust Matrix Completion
In this paper, we consider the problem of Robust Matrix Completion (RMC), where the goal is to recover a low-rank matrix from a small number of observed entries, a few of which may be arbitrarily corrupted. We propose a simple projected gradient descent-based method to estimate the low-rank matrix that alternates between a projected gradient descent step and cleaning up a few of the corrupted entries using hard-thresholding. Our algorithm solves RMC using a nearly optimal number of observations while tolerating a nearly optimal number of corruptions. Our result also implies a significant improvement over existing time complexity bounds for the low-rank matrix completion problem. Finally, applying our result to the robust PCA problem (low-rank + sparse matrix separation) yields an algorithm that runs in nearly linear time in the matrix dimensions, whereas existing state-of-the-art methods require quadratic time. Our empirical results corroborate our theory and show that, even for moderately sized problems, our method for robust PCA is an order of magnitude faster than existing methods.
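To make the alternating scheme concrete, the following is a minimal numpy sketch of one plausible instantiation: a gradient step on the observed entries, projection onto rank-r matrices via a truncated SVD, and hard-thresholding of the largest observed residuals as presumed corruptions. The function name, step size, iteration count, and corruption budget are illustrative assumptions, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def rmc_pgd_sketch(M_obs, mask, rank, num_corruptions, n_iters=100, step=1.0):
    """Hypothetical sketch of alternating projected gradient descent and
    hard-thresholding for robust matrix completion.

    M_obs : observed matrix (unobserved entries may be 0), shape (m, n)
    mask  : boolean array, True where an entry is observed
    rank  : target rank of the low-rank component
    num_corruptions : number of observed entries to treat as corrupted
    """
    m, n = M_obs.shape
    L = np.zeros((m, n))            # low-rank estimate
    S = np.zeros((m, n))            # sparse-corruption estimate
    p = mask.mean()                 # sampling fraction, used to rescale the gradient
    for _ in range(n_iters):
        # Gradient of 0.5 * ||P_Omega(L + S - M_obs)||_F^2 with respect to L
        grad = mask * (L + S - M_obs) / p
        # Projection onto rank-`rank` matrices via truncated SVD
        U, s, Vt = np.linalg.svd(L - step * grad, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
        # Hard-thresholding: absorb the largest observed residuals into S
        R = mask * (M_obs - L)
        obs_abs = np.abs(R[mask])
        k = min(num_corruptions, obs_abs.size)
        if k > 0:
            thresh = np.partition(obs_abs, -k)[-k]
            S = np.where(mask & (np.abs(R) >= thresh), R, 0.0)
    return L, S
```

The full SVD above is used only for readability; the nearly linear running time claimed in the abstract would require a more careful implementation of the projection step.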
Author Information
Yeshwanth Cherapanamjeri (Microsoft Research)
Prateek Jain (Microsoft Research)
Kartik Gupta (Microsoft Research)
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Talk: Nearly Optimal Robust Matrix Completion
  Mon. Aug 7th 05:30 -- 05:48 AM, Room C4.4
More from the Same Authors
- 2020: Discussion Panel
  Krzysztof Dembczynski · Prateek Jain · Alina Beygelzimer · Inderjit Dhillon · Anna Choromanska · Maryam Majzoubi · Yashoteja Prabhu · John Langford
- 2020 Poster: Soft Threshold Weight Reparameterization for Learnable Sparsity
  Aditya Kusupati · Vivek Ramanujan · Raghav Somani · Mitchell Wortsman · Prateek Jain · Sham Kakade · Ali Farhadi
- 2020 Poster: Optimization and Analysis of the pAp@k Metric for Recommender Systems
  Gaurush Hiranandani · Warut Vijitbenjaronk · Sanmi Koyejo · Prateek Jain
- 2020 Poster: DROCC: Deep Robust One-Class Classification
  Sachin Goyal · Aditi Raghunathan · Moksh Jain · Harsha Vardhan Simhadri · Prateek Jain
- 2019 Poster: SGD without Replacement: Sharper Rates for General Smooth Convex Functions
  Dheeraj Nagaraj · Prateek Jain · Praneeth Netrapalli
- 2019 Oral: SGD without Replacement: Sharper Rates for General Smooth Convex Functions
  Dheeraj Nagaraj · Prateek Jain · Praneeth Netrapalli
- 2018 Poster: Differentially Private Matrix Completion Revisited
  Prateek Jain · Om Dipakbhai Thakkar · Abhradeep Thakurta
- 2018 Oral: Differentially Private Matrix Completion Revisited
  Prateek Jain · Om Dipakbhai Thakkar · Abhradeep Thakurta
- 2017 Workshop: ML on a budget: IoT, Mobile and other tiny-ML applications
  Manik Varma · Venkatesh Saligrama · Prateek Jain
- 2017 Poster: ProtoNN: Compressed and Accurate kNN for Resource-scarce Devices
  Chirag Gupta · Arun Suggala · Ankit Goyal · Saurabh Goyal · Ashish Kumar · Bhargavi Paranjape · Harsha Vardhan Simhadri · Raghavendra Udupa · Manik Varma · Prateek Jain
- 2017 Talk: ProtoNN: Compressed and Accurate kNN for Resource-scarce Devices
  Chirag Gupta · Arun Suggala · Ankit Goyal · Saurabh Goyal · Ashish Kumar · Bhargavi Paranjape · Harsha Vardhan Simhadri · Raghavendra Udupa · Manik Varma · Prateek Jain
- 2017 Poster: Recovery Guarantees for One-hidden-layer Neural Networks
  Kai Zhong · Zhao Song · Prateek Jain · Peter Bartlett · Inderjit Dhillon
- 2017 Poster: Active Heteroscedastic Regression
  Kamalika Chaudhuri · Prateek Jain · Nagarajan Natarajan
- 2017 Talk: Active Heteroscedastic Regression
  Kamalika Chaudhuri · Prateek Jain · Nagarajan Natarajan
- 2017 Talk: Recovery Guarantees for One-hidden-layer Neural Networks
  Kai Zhong · Zhao Song · Prateek Jain · Peter Bartlett · Inderjit Dhillon