Poster
Tue Aug 08 01:30 AM -- 05:00 AM (PDT) @ Gallery #40
Conditional Accelerated Lazy Stochastic Gradient Descent
In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and a convergence rate of O(1/epsilon^2), improving over the projection-free, Online Frank-Wolfe based stochastic gradient descent of Hazan and Kale (2012), which converges at rate O(1/epsilon^4).
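
To illustrate what "projection-free" means in this family of methods, the following sketch shows a plain stochastic Frank-Wolfe (conditional gradient) loop on a toy least-squares problem over the probability simplex. It is not the paper's conditional accelerated lazy algorithm; the problem setup, step sizes, and batch size are illustrative assumptions, and the point is only that feasibility is maintained through a linear minimization oracle and convex combinations rather than projections.

```python
# Minimal stochastic Frank-Wolfe sketch (NOT the paper's accelerated lazy
# variant): projection-free optimization over the probability simplex.
import numpy as np

rng = np.random.default_rng(0)
n_samples, dim = 500, 20
A = rng.normal(size=(n_samples, dim))
b = A @ rng.dirichlet(np.ones(dim)) + 0.01 * rng.normal(size=n_samples)

def stochastic_gradient(x, batch_size=32):
    """Unbiased mini-batch gradient estimate of f(x) = (1/2n)||Ax - b||^2."""
    idx = rng.choice(n_samples, size=batch_size, replace=False)
    Ab, bb = A[idx], b[idx]
    return Ab.T @ (Ab @ x - bb) / batch_size

def lmo_simplex(g):
    """Linear minimization oracle over the simplex: argmin_{v in simplex} <g, v>."""
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0           # the optimum is attained at a vertex
    return v

x = np.full(dim, 1.0 / dim)         # start at the simplex barycenter
for t in range(1, 201):
    g = stochastic_gradient(x)
    v = lmo_simplex(g)              # no projection step anywhere
    gamma = 2.0 / (t + 2)           # standard Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * v # convex combination keeps x feasible

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```

The paper's contribution, per the abstract, is an accelerated and "lazy" version of this template that achieves the O(1/epsilon^2) rate with an optimal number of stochastic first-order oracle calls, rather than the simple update shown here.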