Poster
A simpler approach to accelerated optimization: iterative averaging meets optimism
Pooria Joulani · Anant Raj · Andras Gyorgy · Csaba Szepesvari

Tue Jul 14 10:00 AM -- 10:45 AM & Tue Jul 14 11:00 PM -- 11:45 PM (PDT)

Recently there have been several attempts to extend Nesterov's accelerated algorithm to smooth stochastic and variance-reduced optimization. In this paper, we show that there is a simpler approach to acceleration: applying optimistic online learning algorithms and querying the gradient oracle at the online average of the intermediate optimization iterates. In particular, we tighten a recent result of Cutkosky (2019) to demonstrate theoretically that online iterate averaging results in a reduced optimization gap, independently of the algorithm involved. We show that carefully combining this technique with existing generic optimistic online learning algorithms yields the optimal accelerated rates for optimizing strongly-convex and non-strongly-convex, possibly composite objectives, with deterministic as well as stochastic first-order oracles. We further extend this idea to variance-reduced optimization. Finally, we also provide "universal" algorithms that achieve the optimal rate for smooth and non-smooth composite objectives simultaneously without further tuning, generalizing the results of Kavis et al. (2019) and solving a number of their open problems.
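The sketch below illustrates the high-level recipe described in the abstract: run an optimistic online-gradient learner and query the gradient oracle at the running average of its iterates, then return that average. It is a minimal, untuned toy version with assumed choices (uniform averaging, a Popov-style optimistic step, a fixed step size lr, and the function name averaged_optimistic_gd are all illustrative), not the paper's actual algorithms, which use specific weighted averages and generic optimistic learners tuned to obtain the accelerated rates.

```python
import numpy as np

def averaged_optimistic_gd(grad, x0, num_steps=100, lr=0.1):
    """Toy sketch: optimistic online gradient steps, with the gradient
    oracle queried at the running average of the learner's iterates.
    Parameters and update choices are illustrative assumptions."""
    w = np.asarray(x0, dtype=float).copy()   # online learner iterate w_t
    x = w.copy()                             # running average x_t of w_1..w_t
    hint = np.zeros_like(w)                  # optimistic hint: previous gradient

    for t in range(1, num_steps + 1):
        # fold the current w_t into the (uniform) running average
        x = x + (w - x) / t
        # query the first-order oracle at the average, not at w_t itself
        g = grad(x)
        # optimistic gradient step: use 2*g_t - g_{t-1} as the update direction
        w = w - lr * (2.0 * g - hint)
        hint = g
    return x

# Toy usage: minimize the quadratic f(x) = 0.5 * ||x||^2, whose gradient is x
x_min = averaged_optimistic_gd(lambda x: x, np.array([5.0, -3.0]))
```

The key point the sketch tries to convey is that the averaging happens online, inside the loop, and the gradient is always evaluated at that average; the returned point is the average itself, so its suboptimality is controlled by the online learner's regret.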

Author Information

Pooria Joulani (DeepMind)
Anant Raj (Max-Planck Institute for Intelligent Systems), Marie-Curie Fellow
Andras Gyorgy (DeepMind)
Csaba Szepesvari (DeepMind/University of Alberta)
