Poster
Conditional Accelerated Lazy Stochastic Gradient Descent
Guanghui Lan · Sebastian Pokutta · Yi Zhou · Daniel Zink
In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm that achieves the optimal number of calls to the stochastic first-order oracle and a convergence rate of O(1/ε^2), improving over the projection-free, Online Frank-Wolfe based stochastic gradient descent of (Hazan and Kale, 2012), whose convergence rate is O(1/ε^4).
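The paper's algorithm itself is not reproduced here; as a hedged illustration of the projection-free idea the abstract refers to, the sketch below shows a generic stochastic Frank-Wolfe (conditional gradient) loop in which a linear minimization oracle over the feasible set replaces projection. The simplex feasible set, least-squares objective, batch size, and step-size schedule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin_{v in simplex} <grad, v> is the vertex e_i with i = argmin_i grad_i."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def stochastic_gradient(x, A, b, rng, batch=8):
    """Unbiased mini-batch gradient of f(x) = E_i[0.5 * (a_i^T x - b_i)^2]:
    sample `batch` rows of (A, b) and average the per-sample gradients."""
    idx = rng.integers(0, A.shape[0], size=batch)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch

def stochastic_frank_wolfe(A, b, iters=500, seed=0):
    """Projection-free stochastic Frank-Wolfe: each iteration makes one call
    to the stochastic first-order oracle and one to the LMO; no projections."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.full(n, 1.0 / n)           # start at the simplex barycenter
    for t in range(1, iters + 1):
        g = stochastic_gradient(x, A, b, rng)
        v = lmo_simplex(g)            # cheap linear step instead of a projection
        gamma = 2.0 / (t + 2)         # classic Frank-Wolfe step-size schedule
        x = (1 - gamma) * x + gamma * v
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 10))
    b = A @ np.full(10, 0.1)          # planted solution inside the simplex
    x = stochastic_frank_wolfe(A, b)
    print("objective:", 0.5 * np.mean((A @ x - b) ** 2))
```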
Author Information
Guanghui (George) Lan (Georgia Tech)
Sebastian Pokutta (Georgia Tech)
Yi Zhou (Georgia Institute of Technology)
Daniel Zink
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Talk: Conditional Accelerated Lazy Stochastic Gradient Descent »
  Tue. Aug 8th, 12:48 -- 01:06 AM, Room Parkside 2
More from the Same Authors
- 2019 Poster: Blended Conditional Gradients »
  Gábor Braun · Sebastian Pokutta · Dan Tu · Stephen Wright
- 2019 Oral: Blended Conditional Gradients »
  Gábor Braun · Sebastian Pokutta · Dan Tu · Stephen Wright
- 2017 Poster: Lazifying Conditional Gradient Algorithms »
  Gábor Braun · Sebastian Pokutta · Daniel Zink
- 2017 Poster: Emulating the Expert: Inverse Optimization through Online Learning »
  Sebastian Pokutta · Andreas Bärmann · Oskar Schneider
- 2017 Talk: Lazifying Conditional Gradient Algorithms »
  Gábor Braun · Sebastian Pokutta · Daniel Zink
- 2017 Talk: Emulating the Expert: Inverse Optimization through Online Learning »
  Sebastian Pokutta · Andreas Bärmann · Oskar Schneider