Poster
First-Order Algorithms Converge Faster than $O(1/k)$ on Convex Problems
Ching-pei Lee · Stephen Wright

Wed Jun 12 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #207
It is well known that both gradient descent and stochastic coordinate descent achieve a global convergence rate of $O(1/k)$ in the objective value when applied to the minimization of a Lipschitz-continuously differentiable, unconstrained convex function. In this work, we improve this rate to $o(1/k)$. We extend the result to proximal gradient and proximal coordinate descent on regularized problems, showing similar $o(1/k)$ convergence rates. The result is tight in the sense that a rate of $O(1/k^{1+\epsilon})$ is not generally attainable for any $\epsilon > 0$ for any of these methods.
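The $o(1/k)$ claim can be checked numerically: for gradient descent with a fixed $1/L$ step size, the scaled gap $k\,(f(x_k) - f^*)$ should tend to zero rather than merely stay bounded. The sketch below is a minimal illustration of that behavior, not code from the paper; the test function $f(x) = x^4$, the starting point, and the constant $L = 12$ (a bound on $|f''(x)|$ over $[-1,1]$) are assumptions made only for this example.

```python
# Minimal sketch (illustration only, not the paper's experiments):
# gradient descent with a fixed 1/L step on the smooth convex function
# f(x) = x^4, tracking k * (f(x_k) - f*).  The o(1/k) result predicts
# this product tends to 0 instead of settling at a positive constant.

def f(x):
    return x ** 4          # assumed test function, with f* = 0 at x = 0

def grad_f(x):
    return 4 * x ** 3

x = 1.0                    # assumed starting point
L = 12.0                   # |f''(x)| = 12 x^2 <= 12 on [-1, 1]
step = 1.0 / L

for k in range(1, 100_001):
    x -= step * grad_f(x)
    if k in (10, 100, 1_000, 10_000, 100_000):
        # The scaled objective gap shrinks toward zero as k grows.
        print(f"k = {k:6d}   k * (f(x_k) - f*) = {k * f(x):.3e}")
```

On this particular function the printed product drops by roughly an order of magnitude for each tenfold increase in $k$, i.e. strictly faster than $1/k$, which is consistent with (though of course no substitute for) the general result stated above.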

Author Information

Ching-pei Lee (University of Wisconsin-Madison)
Stephen Wright (University of Wisconsin-Madison)
