• Ji Liu and Steve Wright and Christopher Re and Victor Bittorf and Srikrishna Sridhar

### An Asynchronous Parallel Stochastic Coordinate Descent Algorithm (pdf)

We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on functions that satisfy an essential strong convexity property and a sublinear rate ($1/K$) on general convex functions. Near-linear speedup on a multicore system can be expected if the number of processors is $O(n^{1/2})$ in unconstrained optimization and $O(n^{1/4})$ in the separable-constrained case, where $n$ is the number of variables.

• Wenliang Zhong and James Kwok

### Fast Stochastic Alternating Direction Method of Multipliers (pdf)

We propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a low per-iteration complexity comparable to that of existing stochastic ADMM algorithms, it improves the convergence rate on convex problems from $O(1/\sqrt{T})$ to $O(1/T)$, where $T$ is the number of iterations.
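The core update behind Liu et al.'s asynchronous method is the basic stochastic coordinate descent step: pick a random coordinate and minimize along it. A minimal serial sketch on a strongly convex quadratic (the paper's contribution is running these updates asynchronously across cores, which this sketch omits; the problem and step sizes here are our own illustrative choices):

```python
import numpy as np

def stochastic_coordinate_descent(A, b, iters=5000, seed=0):
    """Serial sketch: minimize f(x) = 0.5 x'Ax - b'x by updating one
    randomly chosen coordinate per iteration. (The paper runs such
    updates asynchronously in parallel; this loop is sequential.)"""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    L = np.diag(A)  # per-coordinate curvature: d/dx_i of grad_i is A[i, i]
    for _ in range(iters):
        i = rng.integers(n)
        g_i = A[i] @ x - b[i]   # i-th partial derivative of f
        x[i] -= g_i / L[i]      # exact coordinate minimization for a quadratic
    return x

# toy strongly convex problem: the minimizer is A^{-1} b
rng = np.random.default_rng(1)
M = rng.standard_normal((8, 5))
A = M.T @ M + np.eye(5)         # positive definite, so f is strongly convex
b = rng.standard_normal(5)
x = stochastic_coordinate_descent(A, b)
print(np.max(np.abs(x - np.linalg.solve(A, b))))
```

On this strongly convex instance the iterates converge linearly, matching the rate regime the abstract describes.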

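For context on the Zhong–Kwok entry, the $O(1/\sqrt{T})$ baseline it improves on is plain linearized stochastic ADMM, where each step linearizes the loss with a single sampled gradient. A hedged sketch on a lasso problem (this is the baseline method, not the paper's incremental-gradient variant; the penalty, step sizes, and problem data are our own illustrative choices):

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_admm_lasso(A, b, lam=0.05, rho=1.0, eta0=0.5, iters=20000, seed=0):
    """Plain linearized stochastic ADMM for the lasso
    min_x (1/2n)||Ax - b||^2 + lam*||x||_1, split as f(x) + lam*||z||_1
    subject to x = z. This is the O(1/sqrt(T)) baseline the paper's
    incremental full-gradient approximation improves to O(1/T)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d); z = np.zeros(d); u = np.zeros(d)  # u: scaled dual variable
    x_avg = np.zeros(d)
    for k in range(iters):
        i = rng.integers(n)
        g = A[i] * (A[i] @ x - b[i])          # unbiased stochastic gradient of f
        eta = eta0 / np.sqrt(k + 1)           # decaying proximal step size
        # x-update: minimize g'x + (rho/2)||x - z + u||^2 + ||x - x_k||^2/(2 eta)
        x = (rho * (z - u) + x / eta - g) / (rho + 1.0 / eta)
        z = soft_threshold(x + u, lam / rho)  # exact prox for the l1 block
        u += x - z                            # dual update on the constraint x = z
        x_avg += x
    return x_avg / iters                      # averaged iterate, as in O(1/sqrt(T)) analyses

# toy lasso: the averaged iterate should beat the all-zeros point
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
w_true = np.zeros(10); w_true[:3] = [2.0, -1.0, 1.5]
b = A @ w_true + 0.1 * rng.standard_normal(50)
obj = lambda x: 0.5 * np.mean((A @ x - b) ** 2) + 0.05 * np.abs(x).sum()
x_hat = stochastic_admm_lasso(A, b)
print(obj(x_hat), obj(np.zeros(10)))
```

The paper's algorithm replaces the single sampled gradient `g` with an incrementally maintained approximation of the full gradient, which is what lifts the rate to $O(1/T)$.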
• Shai Shalev-Shwartz and Tong Zhang

### Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization (pdf)

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
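The building block that Shalev-Shwartz and Zhang accelerate is the stochastic dual coordinate ascent (SDCA) update. A sketch of plain SDCA for ridge regression, without the proximal operator or the inner-outer acceleration the paper adds (the loss, regularizer, and closed-form update shown are the standard squared-loss case; all problem data here are our own illustrative choices):

```python
import numpy as np

def sdca_ridge(X, y, lam=0.1, epochs=100, seed=0):
    """Plain SDCA for ridge regression:
    min_w (1/n) * sum_i 0.5*(w.x_i - y_i)^2 + (lam/2)*||w||^2.
    Each step maximizes the dual objective exactly over one dual
    coordinate alpha_i. (The paper adds a proximal term and an
    accelerating inner-outer loop on top of updates like these.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)           # dual variables, one per example
    w = np.zeros(d)               # primal iterate, kept equal to X'alpha/(lam*n)
    sq = (X ** 2).sum(axis=1)     # ||x_i||^2, precomputed
    for _ in range(epochs * n):
        i = rng.integers(n)
        # closed-form dual maximization over alpha_i for the squared loss
        delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq[i] / (lam * n))
        alpha[i] += delta
        w += delta * X[i] / (lam * n)   # keep w consistent with alpha
    return w

# toy check against the closed-form ridge solution
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 5))
y = rng.standard_normal(40)
lam = 0.1
w = sdca_ridge(X, y, lam=lam)
w_exact = np.linalg.solve(X.T @ X / len(y) + lam * np.eye(5), X.T @ y / len(y))
print(np.max(np.abs(w - w_exact)))
```

For smooth losses such as this one, SDCA already converges linearly; the paper's proximal extension handles non-smooth regularizers (e.g. Lasso), and its acceleration improves the dependence on the condition number.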

2013-2014 ICML | International Conference on Machine Learning