Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations

Quanming Yao · James Kwok · Bo Han

Pacific Ballroom #211

Keywords: [ Tensor Methods ] [ Matrix Factorization ]


Nonconvex regularizers have been successfully used in low-rank matrix learning. In this paper, we extend this to the more challenging problem of low-rank tensor completion. Based on the proximal average algorithm, we develop an efficient solver that avoids expensive tensor folding and unfolding. A special "sparse plus low-rank" structure, which is essential for fast computation of individual proximal steps, is maintained throughout the iterations. We also incorporate adaptive momentum to further speed up empirical convergence. Convergence results to critical points are provided under smoothness and Kurdyka-Łojasiewicz conditions. Experimental results on a number of synthetic and real-world data sets show that the proposed algorithm is more efficient in both time and space, and is also more accurate than existing approaches.
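The core idea of such proximal iterations can be illustrated in a minimal sketch. The snippet below is a hypothetical simplification, not the paper's algorithm: it runs a proximal-gradient step for matrix (rather than tensor) completion, where the gradient of the observed-entry loss is sparse and the proximal map of the nuclear norm yields a low-rank iterate, so each intermediate point has the "sparse plus low-rank" form that makes the proximal step cheap. The function names `prox_nuclear` and `proximal_step` are illustrative, and convex soft-thresholding stands in for the paper's nonconvex regularizers.

```python
import numpy as np

def prox_nuclear(Z, lam):
    # Proximal map of lam * ||.||_* : singular-value soft-thresholding.
    # (The paper uses nonconvex penalties; this convex prox is a stand-in.)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    s = np.maximum(s - lam, 0.0)
    return (U * s) @ Vt

def proximal_step(X, mask, M, step, lam):
    # Gradient of 0.5 * ||P_Omega(X - M)||^2 is nonzero only on observed
    # entries, i.e. it is sparse; X itself stays low-rank after the prox.
    G = np.where(mask, X - M, 0.0)          # sparse part
    return prox_nuclear(X - step * G, step * lam)  # low-rank part

# Tiny demo: recover a rank-1 matrix from 70% observed entries.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(8), rng.standard_normal(6))
mask = rng.random(M.shape) < 0.7
X = np.zeros_like(M)
for _ in range(100):
    X = proximal_step(X, mask, M, step=1.0, lam=0.01)
```

The full algorithm avoids forming any unfolding explicitly and adds adaptive momentum on top of iterations of this shape.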
