Poster

Inductive Matrix Completion: No Bad Local Minima and a Fast Algorithm

Pini Zilber · Boaz Nadler

Hall E #603

Keywords: [ T: Optimization ] [ MISC: Unsupervised and Semi-supervised Learning ] [ OPT: Global Optimization ] [ OPT: Non-Convex ]


Abstract:

The inductive matrix completion (IMC) problem is to recover a low-rank matrix from few observed entries while incorporating prior knowledge about its row and column subspaces. In this work, we make three contributions to the IMC problem: (i) we prove that under suitable conditions, the IMC optimization landscape has no bad local minima; (ii) we derive a simple scheme with theoretical guarantees to estimate the rank of the unknown matrix; and (iii) we propose GNIMC, a simple Gauss-Newton based method to solve the IMC problem, analyze its runtime and derive strong recovery guarantees for it. The guarantees for GNIMC are sharper in several aspects than those available for other methods, including a quadratic convergence rate, fewer required observed entries and stability to errors or deviations from exact low rank. Empirically, given entries observed uniformly at random, GNIMC recovers the underlying matrix substantially faster than several competing methods.
