Poster in Workshop: Sampling and Optimization in Discrete Space
Strictly Low Rank Constraint Optimization --- An Asymptotically $\mathcal{O}(\frac{1}{t^2})$ Method
Mengyuan Zhang · Kai Liu
Abstract:
We study a class of non-convex and non-smooth problems with rank regularization, which induces sparsity in the singular values of the solution. We propose to solve the problem with proximal gradient descent and to accelerate the process with a novel support set projection applied to the singular values of each iterate. We show that our algorithms achieve a convergence rate of $\mathcal{O}(\frac{1}{t^2})$, which matches Nesterov's optimal rate for first-order methods on smooth convex problems, and the rate of the standard accelerated proximal gradient method on convex problems. Moreover, the obtained solutions are highly sparse, and the support set of singular values shrinks monotonically across updates, which improves the interpretability of the algorithms.
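The abstract gives no code, so the following is a minimal NumPy sketch of one plausible instantiation of the idea it describes: Nesterov-accelerated proximal gradient descent in which the proximal step for the rank penalty is singular value hard thresholding, followed by a projection that intersects the current singular-value support with the previous one so the support can only shrink. The names `grad_f`, `svd_hard_threshold`, and `accelerated_prox_rank`, the step size $1/L$, and the exact thresholding rule are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def svd_hard_threshold(X, tau):
    """Proximal operator of tau * rank(X): keep singular values above
    sqrt(2 * tau) (hard thresholding on the spectrum). Returns the
    thresholded matrix and the index set of surviving singular values."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    keep = s > np.sqrt(2.0 * tau)
    Z = U[:, keep] @ np.diag(s[keep]) @ Vt[keep, :]
    return Z, set(np.flatnonzero(keep).tolist())

def accelerated_prox_rank(grad_f, X0, L, tau, iters=100):
    """Accelerated proximal gradient with a support set projection on the
    singular values (illustrative sketch, not the paper's exact algorithm)."""
    X_prev, X = X0.copy(), X0.copy()
    support = None          # active set of singular-value indices
    t_prev, t_cur = 1.0, 1.0
    for _ in range(iters):
        # Nesterov extrapolation (momentum) step
        Y = X + ((t_prev - 1.0) / t_cur) * (X - X_prev)
        # proximal gradient step on the rank penalty
        Z, new_support = svd_hard_threshold(Y - grad_f(Y) / L, tau / L)
        # support set projection: intersect with the previous support,
        # so the set of active singular values can only shrink
        if support is not None and not new_support <= support:
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            keep = sorted(new_support & support)
            Z = U[:, keep] @ np.diag(s[keep]) @ Vt[keep, :]
            new_support = set(keep)
        support = new_support
        X_prev, X = X, Z
        # standard FISTA-style momentum parameter update
        t_prev, t_cur = t_cur, (1.0 + np.sqrt(1.0 + 4.0 * t_cur ** 2)) / 2.0
    return X

# usage: low-rank matrix denoising with f(X) = 0.5 * ||X - M||_F^2 (grad has L = 1)
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 5)) @ rng.standard_normal((5, 20))
X_hat = accelerated_prox_rank(lambda X: X - M, np.zeros_like(M), L=1.0, tau=0.5)
```

Because the singular values are sorted in descending order, the support intersection amounts to capping the rank at the previous iterate's rank, which is one simple way to realize the shrinking-support behavior the abstract highlights.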