Accelerated algorithms have broad applications in large-scale optimization, due to their generality and fast convergence. However, their stability in the practical setting of noise-corrupted gradient oracles is not well understood. This paper makes two main technical contributions: (i) a new accelerated method, AGDP, that generalizes Nesterov's AGD and improves on the recent method AXGD (Diakonikolas & Orecchia, 2018), and (ii) a theoretical study of accelerated algorithms under noisy and inexact gradient oracles, supported by numerical experiments. The study leverages the simplicity of AGDP and its analysis to clarify the interaction between noise and acceleration, and to suggest modifications to the algorithm that reduce the mean and variance of the error incurred by gradient noise.
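To illustrate the setting the abstract describes, the following is a minimal sketch of standard Nesterov AGD run against a gradient oracle corrupted by additive Gaussian noise. This is not the paper's AGDP method; the function name, the quadratic test problem, and the noise model are illustrative assumptions chosen to show how an accelerated method behaves when its oracle is inexact.

```python
import numpy as np

def nesterov_agd(grad, x0, L, steps, noise_std=0.0, rng=None):
    """Nesterov's accelerated gradient descent for an L-smooth convex f.

    If noise_std > 0, each oracle call returns grad(y) plus i.i.d.
    Gaussian noise, modeling a noise-corrupted gradient oracle.
    """
    rng = rng or np.random.default_rng(0)
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(steps):
        g = grad(y) + noise_std * rng.standard_normal(y.shape)
        x_next = y - g / L                                 # gradient step at extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum parameter update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation (momentum) step
        x, t = x_next, t_next
    return x

# Toy problem: minimize f(x) = 0.5 * ||A x - b||^2, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A.T @ A).max()          # smoothness constant of f
grad = lambda x: A.T @ (A @ x - b)

x_exact = nesterov_agd(grad, np.zeros(2), L, steps=400)                 # exact oracle
x_noisy = nesterov_agd(grad, np.zeros(2), L, steps=400, noise_std=0.1)  # noisy oracle
```

With the exact oracle the iterates converge to the minimizer, while with the noisy oracle they settle into a neighborhood of it; the paper's analysis concerns exactly how acceleration affects the size of that neighborhood.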
Author Information
Michael Cohen
Jelena Diakonikolas (Boston University)
Jelena Diakonikolas is a Postdoctoral Associate at Boston University. She completed her Ph.D. degree in electrical engineering at Columbia University. Her research interests include large-scale optimization with a focus on first-order methods and their applications in engineered and networked systems. She is a recipient of a Simons-Berkeley Research Fellowship (2018), the Morton B. Friedman Prize for Excellence at Columbia Engineering (2017), and a Qualcomm Innovation Fellowship (2015). In 2016, she was featured on the inaugural N^2 Women list of “10 Women in Networking/Communications That You Should Watch”.
Lorenzo Orecchia (Boston University)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: On Acceleration with Noise-Corrupted Gradients
  Thu. Jul 12th, 04:15 -- 07:00 PM, Hall B #220
More from the Same Authors
- 2018 Poster: Alternating Randomized Block Coordinate Descent
  Jelena Diakonikolas · Lorenzo Orecchia
- 2018 Oral: Alternating Randomized Block Coordinate Descent
  Jelena Diakonikolas · Lorenzo Orecchia
- 2017 Poster: Connected Subgraph Detection with Mirror Descent on SDPs
  Cem Aksoylar · Lorenzo Orecchia · Venkatesh Saligrama
- 2017 Talk: Connected Subgraph Detection with Mirror Descent on SDPs
  Cem Aksoylar · Lorenzo Orecchia · Venkatesh Saligrama