Poster
Lazifying Conditional Gradient Algorithms
Gábor Braun · Sebastian Pokutta · Daniel Zink
Conditional gradient algorithms (also often called Frank-Wolfe algorithms) are popular because they require only a linear optimization oracle, and they have recently gained significant traction in online learning as well. While simple in principle, in many cases the actual implementation of the linear optimization oracle is costly. We present a general method to lazify various conditional gradient algorithms, which in actual computations leads to several orders of magnitude of speedup in wall-clock time. This is achieved by replacing the linear optimization oracle with a faster separation oracle, relying on only a few linear optimization oracle calls.
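To make the lazification idea concrete, below is a minimal Python sketch of a lazy Frank-Wolfe loop over the probability simplex: a cheap "weak separation" step first scans a cache of previously returned vertices and only falls back to the true linear optimization oracle when no cached vertex gives enough improvement, halving the improvement target on a negative answer. The simplex oracle, step-size rule, constant K, and all names here are illustrative assumptions, not the paper's exact algorithm or implementation.

```python
import numpy as np

def lp_oracle_simplex(grad):
    """True linear optimization oracle over the probability simplex:
    argmin_v <grad, v> is the vertex e_i at the smallest gradient entry."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def lazy_frank_wolfe(grad_f, x0, T=200, K=2.0):
    """Sketch of a lazified conditional gradient method (assumed details).

    Instead of calling the (expensive) LP oracle every iteration, a weak
    separation step first scans a cache of previously returned vertices and
    accepts any vertex giving at least phi improvement in the linear model.
    Only if the cache fails is the LP oracle called; if even the LP oracle
    cannot certify phi improvement, the target phi is halved (negative call).
    Step sizes and the initial phi are illustrative choices.
    """
    x = x0.copy()
    cache = [x0.copy()]
    g = grad_f(x)
    phi = np.dot(g, x - lp_oracle_simplex(g)) / K  # initial accuracy target
    for t in range(T):
        g = grad_f(x)
        # Weak separation: try cached vertices first (cheap).
        v = next((c for c in cache if np.dot(g, x - c) >= phi), None)
        if v is None:
            v = lp_oracle_simplex(g)           # expensive call, used sparingly
            if np.dot(g, x - v) >= phi:
                cache.append(v)                # positive answer: keep vertex
            else:
                phi /= 2.0                     # negative answer: tighten target
                continue                       # no progress step this round
        gamma = 2.0 / (t + 2.0)                # standard Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Example usage: minimize ||x - b||^2 over the simplex (b outside the simplex).
b = np.array([0.8, 0.6, -0.2])
x_star = lazy_frank_wolfe(lambda x: 2.0 * (x - b), np.ones(3) / 3.0)
```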
Author Information
Gábor Braun (Georgia Institute of Technology)
Sebastian Pokutta (Georgia Tech)
Daniel Zink
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Talk: Lazifying Conditional Gradient Algorithms
  Tue. Aug 8th, 12:30 -- 12:48 AM, Room Parkside 2
More from the Same Authors
- 2019 Poster: Blended Conditional Gradients
  Gábor Braun · Sebastian Pokutta · Dan Tu · Stephen Wright
- 2019 Oral: Blended Conditional Gradients
  Gábor Braun · Sebastian Pokutta · Dan Tu · Stephen Wright
- 2017 Poster: Conditional Accelerated Lazy Stochastic Gradient Descent
  Guanghui Lan · Sebastian Pokutta · Yi Zhou · Daniel Zink
- 2017 Poster: Emulating the Expert: Inverse Optimization through Online Learning
  Sebastian Pokutta · Andreas Bärmann · Oskar Schneider
- 2017 Talk: Conditional Accelerated Lazy Stochastic Gradient Descent
  Guanghui Lan · Sebastian Pokutta · Yi Zhou · Daniel Zink
- 2017 Talk: Emulating the Expert: Inverse Optimization through Online Learning
  Sebastian Pokutta · Andreas Bärmann · Oskar Schneider