
Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling
Gregory Benton · Wesley Maddox · Sanae Lotfi · Andrew Wilson

Tue Jul 20 06:45 AM -- 06:50 AM (PDT)

With a better understanding of the loss surfaces for multilayer networks, we can build more robust and accurate training procedures. Recently it was discovered that independently trained SGD solutions can be connected along one-dimensional paths of near-constant training loss. In this paper, we in fact demonstrate the existence of mode-connecting simplicial complexes that form multi-dimensional manifolds of low loss, connecting many independently trained models. Building on this discovery, we show how to efficiently construct simplicial complexes for fast ensembling, outperforming independently trained deep ensembles in accuracy, calibration, and robustness to dataset shift. Notably, our approach is easy to apply and only requires a few training epochs to discover a low-loss simplex.
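The ensembling idea in the abstract can be sketched concretely. In a minimal illustration (not the authors' released code), the vertices of a low-loss simplex are flattened parameter vectors, and ensemble members are drawn from the simplex interior by sampling uniform barycentric coordinates from a symmetric Dirichlet distribution and forming the corresponding convex combination; the names `sample_simplex_model` and `ensemble_predict` below are hypothetical helpers for this sketch, with a toy linear model standing in for a trained network:

```python
# Illustrative sketch of simplex ensembling (assumed setup, not the paper's code):
# vertices are parameter vectors of independently trained models that span a
# low-loss simplex; ensemble members are convex combinations of the vertices.
import numpy as np

rng = np.random.default_rng(0)

def sample_simplex_model(vertices, rng):
    """Sample one parameter vector uniformly from the simplex interior."""
    k = len(vertices)
    weights = rng.dirichlet(np.ones(k))  # uniform over the (k-1)-simplex
    return np.tensordot(weights, np.stack(vertices), axes=1)

def ensemble_predict(vertices, x, predict_fn, n_samples=10, rng=rng):
    """Average predictions over models sampled from within the simplex."""
    preds = [predict_fn(sample_simplex_model(vertices, rng), x)
             for _ in range(n_samples)]
    return np.mean(np.stack(preds), axis=0)

# Toy example: three "independently trained" linear models with 5 weights each.
vertices = [rng.normal(size=5) for _ in range(3)]
x = rng.normal(size=(4, 5))
predict = lambda w, inputs: inputs @ w
y = ensemble_predict(vertices, x, predict)  # one averaged prediction per row of x
```

The key property this sketch relies on is that every convex combination of the vertices stays inside the discovered low-loss region, so averaging sampled members behaves like a cheap deep ensemble.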

Author Information

Gregory Benton (New York University)
Wesley Maddox (New York University)
Sanae Lotfi (New York University)

I am a PhD student at the Center for Data Science at NYU and a DeepMind fellow, advised by Professor Andrew Gordon Wilson. I am currently interested in designing robust models that can generalize well in and out of distribution. I also work on the closely related question of understanding and quantifying the generalization properties of deep neural networks. More broadly, my research interests include out-of-distribution generalization, Bayesian learning, probabilistic modeling, large-scale optimization, and loss surface analysis. Prior to NYU, I obtained a master’s degree in applied mathematics from Polytechnique Montreal. I was fortunate to work there with Professors Andrea Lodi and Dominique Orban to design stochastic first- and second-order algorithms with compelling theoretical and empirical properties for machine learning and large-scale optimization. I received the Best Master’s Thesis Award in Applied Mathematics at Polytechnique Montreal for this work. I also hold an engineering degree in general engineering and applied mathematics from CentraleSupélec.

Andrew Wilson (New York University)

Andrew Gordon Wilson is faculty in the Courant Institute and Center for Data Science at NYU. His interests include probabilistic modelling, Gaussian processes, Bayesian statistics, physics-inspired machine learning, and loss surfaces and generalization in deep learning. His webpage is https://cims.nyu.edu/~andrewgw.
