Poster in Workshop: Dynamic Neural Networks

Neural Architecture Search with Loss Flatness-aware Measure

Joonhyun Jeong · Joonsang Yu · Dongyoon Han · YoungJoon Yoo


Abstract:

We propose a new proxy measure for Neural Architecture Search (NAS) that focuses on the flatness of the loss surface. Going a step beyond existing NAS studies that rely on validation-set accuracy or on the angle measure, which captures convergence speed during training, we claim that the flatness of the loss surface can be a promising proxy for predicting the generalization capability of neural network architectures.
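As a rough illustration of what a flatness-based proxy could look like, the sketch below estimates flatness as the average loss increase under small random Gaussian perturbations of the weights, a common generic sharpness-style measure. This is only an assumption for illustration; the function `flatness_proxy` and its parameters `sigma` and `n_samples` are hypothetical and do not reproduce the measure proposed in the paper.

```python
import copy
import torch


def flatness_proxy(model, data_loader, loss_fn, sigma=0.01, n_samples=5, device="cpu"):
    """Estimate loss-surface flatness as the mean loss increase under small
    Gaussian weight perturbations (a generic sharpness-style proxy; not
    necessarily the measure used in this work). Lower values suggest a
    flatter minimum."""
    model = model.to(device).eval()

    @torch.no_grad()
    def mean_loss(m):
        total, count = 0.0, 0
        for x, y in data_loader:
            x, y = x.to(device), y.to(device)
            total += loss_fn(m(x), y).item() * x.size(0)
            count += x.size(0)
        return total / count

    base = mean_loss(model)
    increases = []
    for _ in range(n_samples):
        # Copy the model and add i.i.d. Gaussian noise to every parameter.
        perturbed = copy.deepcopy(model)
        with torch.no_grad():
            for p in perturbed.parameters():
                p.add_(sigma * torch.randn_like(p))
        increases.append(mean_loss(perturbed) - base)

    # Average loss increase over perturbations: smaller -> flatter region.
    return sum(increases) / len(increases)
```

Under this kind of proxy, candidate architectures would be ranked by the returned score (smaller is better) instead of, or alongside, validation accuracy during the search.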
