Recently, there has been a great deal of research attention on understanding the convergence behavior of first-order methods. One line of this research analyzes first-order methods using tools from continuous dynamical systems, such as ordinary differential equations and differential inclusions. These results shed light on first-order methods from a non-optimization point of view. The alternating direction method of multipliers (ADMM) is a widely used first-order method for solving optimization problems whose variables and objective function have a separable structure, and it is important to investigate its behavior using these new techniques from dynamical systems. Existing works along this line have mainly focused on problems with smooth objective functions, which excludes many important applications traditionally solved by ADMM variants. In this paper, we analyze some well-known and widely used ADMM variants for nonsmooth optimization problems using the tools of differential inclusions. In particular, we analyze the convergence behavior of linearized ADMM and gradient-based ADMM for nonsmooth problems. We anticipate that these results will provide new insights into ADMM for solving nonsmooth problems.
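To make the objects of study concrete, here is a minimal sketch (not the paper's analysis) of ADMM on a standard nonsmooth problem, the lasso: minimize 0.5||Ax - b||^2 + lam||x||_1 via the split x - z = 0, where the l1 term is handled by its proximal operator (soft-thresholding). A `linearized=True` flag replaces the exact x-minimization with a single gradient step on the augmented Lagrangian, in the spirit of linearized ADMM. All function names, data, step sizes, and iteration counts below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k * ||.||_1, the nonsmooth part of the objective."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.5, rho=1.0, iters=300, linearized=False):
    """ADMM for min_x 0.5||Ax - b||^2 + lam||x||_1 via the split
    min_{x,z} 0.5||Ax - b||^2 + lam||z||_1  s.t.  x - z = 0.
    With linearized=True, the exact x-minimization is replaced by a single
    gradient step on the augmented Lagrangian (a linearized-ADMM-style update)."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)  # u is the scaled dual variable
    AtA, Atb = A.T @ A, A.T @ b
    if linearized:
        tau = 1.0 / (np.linalg.norm(A, 2) ** 2 + rho)  # step size below 1/L
    else:
        M = np.linalg.inv(AtA + rho * np.eye(n))       # cached solve for x-update
    for _ in range(iters):
        if linearized:
            grad = AtA @ x - Atb + rho * (x - z + u)   # grad_x of augmented Lagrangian
            x = x - tau * grad                         # single gradient step
        else:
            x = M @ (Atb + rho * (z - u))              # exact x-minimization
        z = soft_threshold(x + u, lam / rho)           # prox of the l1 term
        u = u + x - z                                  # dual update on the residual
    return z

# Illustrative data: a sparse ground truth recovered from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)

x_hat = admm_lasso(A, b)
x_hat_lin = admm_lasso(A, b, iters=1500, linearized=True)
```

Both variants drive the residual x - z to zero; the linearized update is cheaper per iteration (no linear solve) but typically needs more iterations, which is one motivation for studying its continuous-time limit.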
Jessica E (Peking University)
Yuren Zhou (Duke University)
Chris Junchi Li (Tencent AI Lab)
Qingyun Sun (Stanford University)
Related Events (a corresponding poster, oral, or spotlight)
2019 Poster: Differential Inclusions for Modeling Nonsmooth ADMM Variants: A Continuous Limit Theory »
Thu Jun 13th 01:30 -- 04:00 AM, Room: Pacific Ballroom