
Talk

Conditional Accelerated Lazy Stochastic Gradient Descent

Guanghui Lan · Sebastian Pokutta · Yi Zhou · Daniel Zink

Parkside 2

Abstract:

In this work, we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and a convergence rate of O(1/ε^2), improving over the projection-free, Online Frank-Wolfe-based stochastic gradient descent of Hazan and Kale (2012), which has a convergence rate of O(1/ε^4).
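The paper's own algorithm is not reproduced here, but for orientation the following is a minimal sketch of the generic projection-free stochastic Frank-Wolfe (conditional gradient) step that this line of work builds on. The oracle, step-size rule, and simplex example are illustrative assumptions, not the method proposed in the talk.

```python
import numpy as np

def stochastic_frank_wolfe(grad_oracle, lmo, x0, num_iters=100):
    """Generic projection-free stochastic Frank-Wolfe loop.

    grad_oracle(x): stochastic estimate of the gradient at x.
    lmo(g): linear minimization oracle, argmin_{v in feasible set} <g, v>.
    x0: feasible starting point.
    """
    x = np.array(x0, dtype=float)
    for t in range(1, num_iters + 1):
        g = grad_oracle(x)                 # call to the stochastic first-order oracle
        v = lmo(g)                         # linear minimization instead of a projection
        gamma = 2.0 / (t + 2.0)            # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v  # convex combination keeps x feasible
    return x

# Illustrative use: minimize E[0.5 * ||x - b + noise||^2] over the probability simplex.
rng = np.random.default_rng(0)
b = np.array([0.2, 0.5, 0.3])

def grad_oracle(x):
    return x - b + 0.01 * rng.standard_normal(x.shape)

def simplex_lmo(g):
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0                  # vertex of the simplex minimizing <g, v>
    return v

x_hat = stochastic_frank_wolfe(grad_oracle, simplex_lmo, x0=np.ones(3) / 3)
print(np.round(x_hat, 3))
```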
