

Poster

Linear Convergence of Randomized Primal-Dual Coordinate Method for Large-scale Linear Constrained Convex Programming

Daoli Zhu · Lei Zhao


Keywords: [ Optimization - Large Scale, Parallel and Distributed ] [ Parallel and Distributed Learning ] [ Large Scale Learning and Big Data ] [ Convex Optimization ] [ Computational Learning Theory ]


Abstract:

Linear constrained convex programming (LCCP) has many practical applications, including support vector machine (SVM) and machine learning portfolio (MLP) problems. We propose the randomized primal-dual coordinate (RPDC) method, a randomized coordinate extension of the first-order primal-dual method of Cohen and Zhu, 1984 and Zhao and Zhu, 2019, to solve LCCP. At each iteration we choose a block of variables uniformly at random and apply linearization and a Bregman-like function (core function) to the selected block, which yields a simple parallel primal-dual decomposition for LCCP. We establish almost sure convergence and an expected O(1/t) convergence rate; under global strong metric subregularity, we further establish the linear convergence of RPDC. Finally, we discuss implementation details of RPDC and present numerical experiments on SVM and MLP problems to verify the linear convergence.
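The sketch below illustrates the kind of scheme the abstract describes: at each iteration one block of primal variables is chosen uniformly at random, the Lagrangian is linearized at the current point with respect to that block, and the block is updated by a proximal step before a dual step on the constraint residual. This is a minimal sketch on a toy quadratic problem, not the authors' algorithm: the quadratic core function, the step sizes tau and rho, the full-residual dual update, and the objective f(x) = 0.5||x - c||^2 are all illustrative assumptions.

```python
# Illustrative randomized primal-dual coordinate update for
#   min f(x)  s.t.  Ax = b,  x >= 0,
# on a toy quadratic f(x) = 0.5 * ||x - c||^2. All problem data,
# step sizes, and the quadratic core function are assumptions made
# for this sketch; this is not the paper's RPDC implementation.
import numpy as np

rng = np.random.default_rng(0)

n, m, n_blocks = 12, 4, 4
A = rng.standard_normal((m, n))
x_feas = rng.random(n)            # a feasible nonnegative point
b = A @ x_feas                    # makes the constraint set nonempty
c = rng.standard_normal(n)

blocks = np.array_split(np.arange(n), n_blocks)  # block partition

x = np.zeros(n)
lam = np.zeros(m)                 # dual multipliers for Ax = b
tau, rho = 0.05, 0.05             # primal/dual step sizes (assumed)

for t in range(20000):
    j = rng.integers(n_blocks)    # pick one block uniformly at random
    idx = blocks[j]
    # Linearize the Lagrangian at the current point w.r.t. the block
    grad_j = (x[idx] - c[idx]) + A[:, idx].T @ lam
    # Proximal step with a quadratic core function = projected
    # gradient step onto x >= 0
    x[idx] = np.maximum(x[idx] - tau * grad_j, 0.0)
    # Dual step on the constraint residual (simplified: full residual)
    lam += rho * (A @ x - b)

print("constraint residual:", np.linalg.norm(A @ x - b))
```

With a quadratic core function the block proximal step reduces to a single projected gradient step on the chosen block, which is why the update above is one clipped subtraction; other Bregman-like core functions would produce different block subproblems, and the parallel decomposition arises because the blocks can be updated independently given the current dual variable.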
