
Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling
Gregory Benton · Wesley Maddox · Sanae Lotfi · Andrew Wilson

Tue Jul 20 06:45 AM -- 06:50 AM (PDT)

With a better understanding of the loss surfaces of multilayer networks, we can build more robust and accurate training procedures. It was recently discovered that independently trained SGD solutions can be connected along one-dimensional paths of near-constant training loss. In this paper, we demonstrate the existence of mode-connecting simplicial complexes that form multi-dimensional manifolds of low loss, connecting many independently trained models. Building on this discovery, we show how to efficiently construct simplicial complexes for fast ensembling, outperforming independently trained deep ensembles in accuracy, calibration, and robustness to dataset shift. Notably, our approach is easy to apply and requires only a few training epochs to discover a low-loss simplex.
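The ensembling idea in the abstract can be illustrated with a minimal sketch (not the paper's implementation): given the flattened parameter vectors of the simplex vertices, sample parameter vectors from the simplex they span and average the resulting models' predictions. The function names and the toy linear `predict_fn` below are hypothetical stand-ins for illustration.

```python
import numpy as np

def sample_simplex_params(vertices, rng):
    """Sample one parameter vector uniformly from the simplex
    spanned by the vertex models' flattened parameters."""
    k = len(vertices)
    # A flat Dirichlet gives uniform samples over the simplex of
    # convex-combination weights.
    weights = rng.dirichlet(np.ones(k))
    # Convex combination of the vertex parameter vectors.
    return np.tensordot(weights, np.asarray(vertices), axes=1)

def simplex_ensemble_predict(vertices, predict_fn, x, n_samples=5, seed=0):
    """Ensemble by averaging the predictions of several models
    drawn from the low-loss simplex."""
    rng = np.random.default_rng(seed)
    preds = [predict_fn(sample_simplex_params(vertices, rng), x)
             for _ in range(n_samples)]
    return np.mean(preds, axis=0)

# Toy usage: two "trained" linear models as simplex vertices.
vertices = [np.zeros(3), np.ones(3)]
prediction = simplex_ensemble_predict(
    vertices, predict_fn=lambda w, x: x @ w, x=np.ones(3))
```

Each sampled model is a convex combination of the vertices, so in this toy case every prediction lies between the two vertex models' outputs; the paper's contribution is showing that, for deep networks, such simplexes can be found where every interior point also has low training loss.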

Author Information

Gregory Benton (New York University)
Wesley Maddox (New York University)
Sanae Lotfi (New York University)
Andrew Wilson (New York University)

Andrew Gordon Wilson is faculty in the Courant Institute and Center for Data Science at NYU. His interests include probabilistic modelling, Gaussian processes, Bayesian statistics, physics inspired machine learning, and loss surfaces and generalization in deep learning. His webpage is https://cims.nyu.edu/~andrewgw.
