
Conditional Accelerated Lazy Stochastic Gradient Descent
Guanghui Lan · Sebastian Pokutta · Yi Zhou · Daniel Zink

Mon Aug 07 05:48 PM -- 06:06 PM (PDT) @ Parkside 2

In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and a convergence rate of O(1 / epsilon^2), improving over the projection-free, Online Frank-Wolfe-based stochastic gradient descent of Hazan and Kale (2012), which converges at rate O(1 / epsilon^4).
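The abstract describes a projection-free method in the Frank-Wolfe (conditional gradient) family driven by a stochastic first-order oracle. As a rough illustration of that template only (not the authors' accelerated lazy algorithm), the sketch below runs a plain stochastic Frank-Wolfe loop over the probability simplex; the objective, noise model, step size `2/(t+2)`, and the `noisy_grad` oracle are all illustrative assumptions, not taken from the paper.

```python
import random

def stochastic_frank_wolfe(grad_oracle, dim, steps=500):
    """Generic stochastic Frank-Wolfe over the probability simplex.

    Each iteration calls a stochastic gradient oracle, solves a linear
    subproblem (over the simplex this is just picking the vertex with the
    smallest gradient coordinate), and moves via a convex combination --
    so no projection step is ever needed.
    """
    x = [1.0 / dim] * dim                        # start at the barycenter
    for t in range(1, steps + 1):
        g = grad_oracle(x)                       # stochastic first-order oracle call
        i = min(range(dim), key=lambda j: g[j])  # linear minimization oracle
        gamma = 2.0 / (t + 2)                    # standard Frank-Wolfe step size
        x = [(1.0 - gamma) * xj for xj in x]     # shrink toward current iterate
        x[i] += gamma                            # mix in the chosen vertex
    return x

# Toy objective f(x) = 0.5 * ||x - a||^2 with additive gradient noise
# (purely illustrative; `a` and the noise level are made up here).
a = [0.7, 0.2, 0.1]

def noisy_grad(x):
    return [xj - aj + random.gauss(0.0, 0.01) for xj, aj in zip(x, a)]

random.seed(0)
x_hat = stochastic_frank_wolfe(noisy_grad, dim=3, steps=2000)
```

Because every iterate is a convex combination of simplex vertices, `x_hat` stays feasible by construction; this feasibility-for-free property is the usual motivation for projection-free methods like the one the talk concerns.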

Author Information

Guanghui (George) Lan (Georgia Tech)
Sebastian Pokutta (Georgia Tech)
Yi Zhou (Georgia Institute of Technology)
Daniel Zink
