Choosing the optimizer is considered to be among the most crucial design decisions in deep learning, and it is not an easy one. The growing literature now lists hundreds of optimization methods. In the absence of clear theoretical guidance and conclusive empirical evidence, the decision is often made based on anecdotes. In this work, we aim to replace these anecdotes, if not with a conclusive ranking, then at least with evidence-backed heuristics. To do so, we perform an extensive, standardized benchmark of fifteen particularly popular deep learning optimizers while giving a concise overview of the wide range of possible choices. Analyzing more than 50,000 individual runs, we contribute the following three points: (i) Optimizer performance varies greatly across tasks. (ii) We observe that evaluating multiple optimizers with default parameters works approximately as well as tuning the hyperparameters of a single, fixed optimizer. (iii) While we cannot discern an optimization method clearly dominating across all tested tasks, we identify a significantly reduced subset of specific optimizers and parameter choices that generally lead to competitive results in our experiments: Adam remains a strong contender, with newer methods failing to significantly and consistently outperform it. Our open-sourced results are available as challenging and well-tuned baselines for more meaningful evaluations of novel optimization methods without requiring any further computational effort.
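To make point (ii) concrete, here is a minimal sketch, assuming PyTorch, of the heuristic "evaluate several optimizers at their library defaults and keep the best." The toy model, synthetic data, training budget, and candidate set are illustrative stand-ins, not the paper's benchmark protocol or code.

```python
import torch
import torch.nn as nn

# Synthetic stand-in data: 50 mini-batches of 32 flattened 28x28 "images", 10 classes.
loader = [(torch.randn(32, 28 * 28), torch.randint(0, 10, (32,)))
          for _ in range(50)]

def make_model():
    # Tiny linear classifier; a stand-in for a real benchmark test problem.
    return nn.Sequential(nn.Linear(28 * 28, 10))

def train(model, optimizer, epochs=3):
    # Train for a fixed small budget and return the final average loss.
    loss_fn = nn.CrossEntropyLoss()
    avg = float("inf")
    for _ in range(epochs):
        total = 0.0
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
            total += loss.item()
        avg = total / len(loader)
    return avg

# Each candidate is constructed fresh with PyTorch's default hyperparameters
# (SGD has no default learning rate, so a conventional 0.01 is used here).
candidates = {
    "SGD":     lambda params: torch.optim.SGD(params, lr=0.01),
    "Adam":    lambda params: torch.optim.Adam(params),
    "RMSprop": lambda params: torch.optim.RMSprop(params),
    "Adagrad": lambda params: torch.optim.Adagrad(params),
}

results = {}
for name, make_opt in candidates.items():
    torch.manual_seed(0)  # identical initialization for a fair-ish comparison
    model = make_model()
    results[name] = train(model, make_opt(model.parameters()))

best = min(results, key=results.get)
print({k: round(v, 4) for k, v in results.items()}, "-> best:", best)
```

In the paper's experiments, such an untuned portfolio was roughly as effective as carefully tuning a single, fixed optimizer, and a loop of this shape makes the portfolio cheap to try.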
Author Information
Robin M Schmidt (University of Tübingen)
Frank Schneider (University of Tübingen)
Philipp Hennig (University of Tübingen)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers
  Wed. Jul 21st, 04:00 -- 06:00 PM
More from the Same Authors
- 2021 Poster: High-Dimensional Gaussian Process Inference with Derivatives
  Filip de Roos · Alexandra Gessner · Philipp Hennig
- 2021 Spotlight: High-Dimensional Gaussian Process Inference with Derivatives
  Filip de Roos · Alexandra Gessner · Philipp Hennig
- 2021 Poster: Bayesian Quadrature on Riemannian Data Manifolds
  Christian Fröhlich · Alexandra Gessner · Philipp Hennig · Bernhard Schölkopf · Georgios Arvanitidis
- 2021 Spotlight: Bayesian Quadrature on Riemannian Data Manifolds
  Christian Fröhlich · Alexandra Gessner · Philipp Hennig · Bernhard Schölkopf · Georgios Arvanitidis
- 2018 Poster: Dissecting Adam: The Sign, Magnitude and Variance of Stochastic Gradients
  Lukas Balles · Philipp Hennig
- 2018 Oral: Dissecting Adam: The Sign, Magnitude and Variance of Stochastic Gradients
  Lukas Balles · Philipp Hennig