

Spotlight

Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering

Peng Wang · Huikang Liu · Anthony Man-Cho So · Laura Balzano

Room 309
Livestream: OPT: Non-Convex

Abstract: The K-subspaces (KSS) method is a generalization of the K-means method for subspace clustering. In this work, we present a local convergence analysis and a recovery guarantee for KSS, assuming data are generated by the semi-random union of subspaces model, in which $N$ points are randomly sampled from $K \ge 2$ overlapping subspaces. We show that if the initial assignment of the KSS method lies within a neighborhood of a true clustering, it converges at a superlinear rate and finds the correct clustering within $\Theta(\log\log N)$ iterations with high probability. Moreover, we propose a thresholding inner-product based spectral method for initialization and prove that it produces a point in this neighborhood. We also present numerical results for the studied method that support our theoretical developments.
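To make the alternating structure of the KSS method concrete, the following is a minimal sketch, not the authors' implementation: each iteration fits a $d$-dimensional subspace to every cluster via a truncated SVD and then reassigns each point to the subspace with the smallest projection residual. The function name `kss`, the data layout (points as columns), the random re-seeding of degenerate clusters, and the random initialization in the usage example are illustrative assumptions; the paper's guarantees apply when the initial assignment is close to a true clustering, e.g., as produced by the proposed thresholding inner-product based spectral initialization.

```python
import numpy as np

def kss(X, assignments, K, d, max_iters=50, rng=None):
    """Minimal K-subspaces sketch.

    X           : (D, N) data matrix with points as columns.
    assignments : length-N initial cluster labels in {0, ..., K-1}.
    K, d        : number of subspaces and their (common) dimension.
    """
    rng = np.random.default_rng() if rng is None else rng
    D, N = X.shape
    bases = [None] * K
    for _ in range(max_iters):
        # Subspace update: top-d left singular vectors of each cluster's data.
        for k in range(K):
            Xk = X[:, assignments == k]
            if Xk.shape[1] < d:
                # Degenerate cluster: re-seed with a random orthonormal basis
                # (an illustrative fallback, not part of the analyzed method).
                bases[k] = np.linalg.qr(rng.standard_normal((D, d)))[0]
            else:
                U, _, _ = np.linalg.svd(Xk, full_matrices=False)
                bases[k] = U[:, :d]
        # Assignment update: smallest projection residual ||x - U U^T x||.
        residuals = np.stack(
            [np.linalg.norm(X - U @ (U.T @ X), axis=0) for U in bases]
        )
        new_assignments = residuals.argmin(axis=0)
        if np.array_equal(new_assignments, assignments):
            break  # fixed point: clustering no longer changes
        assignments = new_assignments
    return assignments, bases

# Usage example: two 3-dimensional subspaces in R^20, random initial labels.
rng = np.random.default_rng(0)
D, d, K, N = 20, 3, 2, 400
U_true = [np.linalg.qr(rng.standard_normal((D, d)))[0] for _ in range(K)]
X = np.hstack([U @ rng.standard_normal((d, N // K)) for U in U_true])
X /= np.linalg.norm(X, axis=0)  # normalize points to the unit sphere
labels, bases = kss(X, rng.integers(K, size=N), K, d, rng=rng)
```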
