
Fast and Provable Nonconvex Tensor RPCA

Haiquan Qiu · Yao Wang · Shaojie Tang · Deyu Meng · Quanming Yao

Hall E #702

Keywords: [ T: Optimization ] [ OPT: Non-Convex ] [ Optimization ]

Abstract: In this paper, we study nonconvex tensor robust principal component analysis (RPCA) based on the $t$-SVD. We first propose an alternating projection method, APT, which converges linearly to the ground truth under the incoherence conditions of tensors. However, since the projection onto the low-rank tensor space in APT can be slow, we further propose to speed up this step by exploiting the tangent space of the low-rank tensor manifold. The resulting algorithm, EAPT, is more efficient than APT while retaining linear convergence. Compared with existing tensor RPCA works, the proposed method, especially EAPT, is not only more effective, owing to its recovery guarantee and its adaptation in the transformed (frequency) domain, but also more efficient, owing to its faster convergence rate and lower per-iteration complexity. These benefits are also verified empirically on both synthetic data and real applications, e.g., hyperspectral image denoising and video background subtraction.
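The abstract describes APT at a high level: alternate between projecting onto the set of low-tubal-rank tensors (via a truncated $t$-SVD, i.e., per-frequency-slice truncated SVDs after an FFT along the third mode) and projecting onto sparse tensors (via entrywise hard thresholding). Below is a minimal NumPy sketch of this alternating-projection loop under those assumptions; the function names (`tsvd_project`, `apt_sketch`), the fixed rank/threshold parameters, and the stopping rule are illustrative and not the paper's exact algorithm, which uses carefully scheduled thresholds to obtain its guarantees.

```python
import numpy as np

def tsvd_project(X, r):
    """Project a 3-way tensor onto tubal rank <= r via truncated t-SVD:
    FFT along the third mode, truncate each frontal-slice SVD, inverse FFT."""
    Xf = np.fft.fft(X, axis=2)
    Lf = np.empty_like(Xf)
    for k in range(X.shape[2]):
        U, s, Vh = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        Lf[:, :, k] = (U[:, :r] * s[:r]) @ Vh[:r, :]
    # Conjugate symmetry of the slices makes the result real up to rounding.
    return np.real(np.fft.ifft(Lf, axis=2))

def hard_threshold(X, tau):
    """Sparse projection: keep entries with magnitude above tau."""
    return np.where(np.abs(X) > tau, X, 0.0)

def apt_sketch(M, r, tau, iters=30):
    """Illustrative alternating projections for tensor RPCA:
    L <- rank-r t-SVD projection of M - S;  S <- hard-threshold of M - L."""
    S = np.zeros_like(M)
    for _ in range(iters):
        L = tsvd_project(M - S, r)
        S = hard_threshold(M - L, tau)
    return L, S
```

With an incoherent low-tubal-rank component and sufficiently large sparse corruptions, a loop of this form recovers the two components; the paper's contribution is proving linear convergence for its (more refined) variant and accelerating the expensive `tsvd_project` step via the tangent space of the low-rank manifold.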