In this paper, we propose a faster stochastic alternating direction method of multipliers (ADMM) for nonconvex optimization by using a new stochastic path-integrated differential estimator (SPIDER), called SPIDER-ADMM. As a major contribution, we give a new theoretical analysis framework for nonconvex stochastic ADMM methods that provides the optimal incremental first-order oracle (IFO) complexity. Specifically, we prove that our SPIDER-ADMM achieves a record-breaking IFO complexity of $\mathcal{O}(n+n^{1/2}\epsilon^{-1})$ for finding an $\epsilon$-approximate solution, which improves on the deterministic ADMM by a factor of $\mathcal{O}(n^{1/2})$, where $n$ denotes the sample size. Based on our new analysis framework, we also prove that the existing ADMM-based nonconvex optimization algorithms, SVRG-ADMM and SAGA-ADMM, have an optimal IFO complexity of $\mathcal{O}(n+n^{2/3}\epsilon^{-1})$. Thus, SPIDER-ADMM improves on the existing stochastic ADMM methods by a factor of $\mathcal{O}(n^{1/6})$. Moreover, we extend SPIDER-ADMM to the online setting and propose a faster online ADMM, \emph{i.e.}, online SPIDER-ADMM. Our theoretical analysis shows that online SPIDER-ADMM has an IFO complexity of $\mathcal{O}(\epsilon^{-\frac{3}{2}})$ for finding an $\epsilon$-approximate solution, which improves the existing best results by a factor of $\mathcal{O}(\epsilon^{-\frac{1}{2}})$. Finally, experimental results on benchmark datasets validate that the proposed algorithms converge faster than existing ADMM algorithms for nonconvex optimization.
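The SPIDER estimator at the heart of the method maintains a running gradient estimate that is corrected along the iterate path and periodically refreshed with a full gradient. The sketch below illustrates this estimator on a toy least-squares problem with a plain gradient step; the problem, step size, and epoch length are illustrative assumptions, not the paper's ADMM setup.

```python
import numpy as np

# Illustrative sketch of the SPIDER gradient estimator that SPIDER-ADMM
# builds on. Toy finite-sum least-squares problem (an assumption here):
#   f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full gradient (1/n) * A^T (A x - b).
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
x_prev = x.copy()
eta, q = 0.01, 10          # step size and refresh period (illustrative)
v = full_grad(x)           # initialize with a full gradient
for t in range(1, 200):
    if t % q == 0:
        v = full_grad(x)   # periodic full-gradient refresh
    else:
        i = rng.integers(n)
        # Path-integrated correction: update the previous estimate by the
        # change of one sampled component gradient along the iterate path.
        v = grad_i(x, i) - grad_i(x_prev, i) + v
    x_prev = x.copy()
    x -= eta * v           # plain gradient step for illustration

# Gradient norm should shrink relative to the starting point.
print(np.linalg.norm(full_grad(x)) < np.linalg.norm(full_grad(np.zeros(d))))
```

Because each inner update touches only one component gradient, the per-iteration cost is O(1) IFO calls, with a full O(n) refresh once per epoch; this is the accounting behind the $\mathcal{O}(n+n^{1/2}\epsilon^{-1})$ complexity.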
Author Information
Feihu Huang (University of Pittsburgh)
Songcan Chen (Nanjing University of Aeronautics and Astronautics)
Heng Huang (University of Pittsburgh)
Related Events (a corresponding poster, oral, or spotlight)

2019 Poster: Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization
Tue Jun 11th, 06:30 - 09:00 PM, Room: Pacific Ballroom