Oral
Essentially No Barriers in Neural Network Energy Landscape
Felix Draxler · Kambis Veschgini · Manfred Salmhofer · Fred Hamprecht
Training neural networks involves finding minima of a high-dimensional non-convex loss function. Relaxing from linear interpolations, we construct continuous paths between minima of recent neural network architectures on CIFAR10 and CIFAR100. Surprisingly, the paths are essentially flat in both the training and test landscapes. This implies that minima are perhaps best seen as points on a single connected manifold of low loss, rather than as the bottoms of distinct valleys.
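The abstract contrasts linear interpolation between minima with the relaxed, curved paths the authors construct. Below is a minimal sketch (not the authors' AutoNEB implementation) of the baseline step: evaluating the loss along a straight line between two trained parameter sets. The names `model`, `data_loader`, `criterion`, and the state dicts `theta_a` / `theta_b` are assumed placeholders you would supply.

```python
# Minimal sketch, assuming two trained networks of identical architecture.
# Not the authors' method: their paths relax away from this straight line
# (e.g. with a nudged-elastic-band style optimization) to find flat routes.
import copy
import torch

def loss_along_linear_path(model, theta_a, theta_b, data_loader, criterion, n_points=20):
    """Average loss at n_points evenly spaced points on the segment
    theta(t) = (1 - t) * theta_a + t * theta_b, for t in [0, 1]."""
    probe = copy.deepcopy(model)  # scratch model to load interpolated weights into
    losses = []
    for t in torch.linspace(0.0, 1.0, n_points):
        interpolated = {}
        for name, pa in theta_a.items():
            pb = theta_b[name]
            # Interpolate floating-point tensors; copy integer buffers
            # (e.g. BatchNorm's num_batches_tracked) unchanged.
            interpolated[name] = (
                torch.lerp(pa, pb, t.item()) if pa.is_floating_point() else pa
            )
        probe.load_state_dict(interpolated)
        probe.eval()
        total, count = 0.0, 0
        with torch.no_grad():
            for x, y in data_loader:
                total += criterion(probe(x), y).item() * x.size(0)
                count += x.size(0)
        losses.append(total / count)
    return losses
```

Along such a straight line the loss typically rises to a large barrier between the two minima; the paper's point is that suitably relaxed paths remove essentially all of that barrier.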
Author Information
Felix Draxler (Heidelberg University)
Kambis Veschgini (University of Heidelberg)
Manfred Salmhofer (Heidelberg University)
Fred Hamprecht (Heidelberg Collaboratory for Image Processing)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: Essentially No Barriers in Neural Network Energy Landscape
  Wed. Jul 11th, 04:15 -- 07:00 PM, Room: Hall B #122
More from the Same Authors
- 2022 Poster: The Algebraic Path Problem for Graph Metrics
  Enrique Fita Sanmartín · Sebastian Damrich · Fred Hamprecht
- 2022 Spotlight: The Algebraic Path Problem for Graph Metrics
  Enrique Fita Sanmartín · Sebastian Damrich · Fred Hamprecht
- 2019 Poster: On the Spectral Bias of Neural Networks
  Nasim Rahaman · Aristide Baratin · Devansh Arpit · Felix Draxler · Min Lin · Fred Hamprecht · Yoshua Bengio · Aaron Courville
- 2019 Oral: On the Spectral Bias of Neural Networks
  Nasim Rahaman · Aristide Baratin · Devansh Arpit · Felix Draxler · Min Lin · Fred Hamprecht · Yoshua Bengio · Aaron Courville