Oral
Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization
Feihu Huang · Songcan Chen · Heng Huang

Tue Jun 11th 11:25 -- 11:30 AM @ Room 104

In this paper, we propose a faster stochastic alternating direction method of multipliers (ADMM) for nonconvex optimization by using the stochastic path-integrated differential estimator (SPIDER); we call the resulting method SPIDER-ADMM. As a major contribution, we give a new theoretical analysis framework for nonconvex stochastic ADMM methods that yields the optimal incremental first-order oracle (IFO) complexity. Specifically, we prove that our SPIDER-ADMM achieves a record-breaking IFO complexity of $\mathcal{O}(n+n^{1/2}\epsilon^{-1})$ for finding an $\epsilon$-approximate solution, which improves on deterministic ADMM by a factor of $\mathcal{O}(n^{1/2})$, where $n$ denotes the sample size. Based on our new analysis framework, we also prove that the existing ADMM-based nonconvex optimization algorithms, SVRG-ADMM and SAGA-ADMM, have an optimal IFO complexity of $\mathcal{O}(n+n^{2/3}\epsilon^{-1})$. Thus, SPIDER-ADMM improves on the existing stochastic ADMM methods by a factor of $\mathcal{O}(n^{1/6})$. Moreover, we extend SPIDER-ADMM to the online setting and propose a faster online ADMM, \emph{i.e.}, online SPIDER-ADMM. Our theoretical analysis shows that online SPIDER-ADMM has an IFO complexity of $\mathcal{O}(\epsilon^{-\frac{3}{2}})$ for finding an $\epsilon$-approximate solution, which improves the existing best results by a factor of $\mathcal{O}(\epsilon^{\frac{1}{2}})$. Finally, experimental results on benchmark datasets validate that the proposed algorithms converge faster than existing ADMM algorithms for nonconvex optimization.
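For context, the SPIDER estimator referenced above recursively maintains a variance-reduced gradient estimate. A minimal sketch for a generic finite-sum problem $\min_x \frac{1}{n}\sum_{i=1}^{n} f_i(x)$ (our generic notation, not necessarily the paper's) is

$$v_t \;=\; \frac{1}{|\mathcal{S}_t|}\sum_{i \in \mathcal{S}_t}\big(\nabla f_i(x_t) - \nabla f_i(x_{t-1})\big) \;+\; v_{t-1},$$

with a full-gradient refresh $v_t = \frac{1}{n}\sum_{i=1}^{n}\nabla f_i(x_t)$ every $q$ iterations. Roughly speaking, an estimator of this form replaces the plain minibatch gradient inside the stochastic ADMM updates; the exact coupling with the ADMM primal and dual steps, and the parameter choices behind the stated complexities, are given in the paper.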

Author Information

Feihu Huang (University of Pittsburgh)
Songcan Chen (Nanjing University of Aeronautics and Astronautics)
Heng Huang (University of Pittsburgh)
