

Poster in Workshop: HiLD: High-dimensional Learning Dynamics Workshop

Hyperparameter Tuning using Loss Landscape

Jianlong Chen · Qinxue Cao · Yefan Zhou · Konstantin Schürholt · Yaoqing Yang


Abstract:

Hyperparameter tuning is crucial in training deep neural networks. In this paper, we propose a new method for hyperparameter tuning that (1) measures multiple metrics related to the structure of the loss landscape, (2) empirically determines the "phase" of the loss landscape, and (3) uses the phase information to tune hyperparameters efficiently. We demonstrate the effectiveness of our method through extensive experiments on network pruning tasks. We show that our method, named TempTuner, achieves significantly lower search time than both conventional grid search and the more advanced sequential model-based Bayesian optimization (SMBO). To the best of our knowledge, this is the first work to apply loss landscape analysis to hyperparameter tuning in neural networks.
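The abstract does not specify which landscape metrics, phase boundaries, or update rules TempTuner uses. The sketch below is only a minimal illustration of the three-step loop it describes: measure a loss-landscape quantity, classify a coarse "phase", and adjust a hyperparameter accordingly. The perturbation-based sharpness proxy, the functions `sharpness_proxy`, `classify_phase`, and `adjust_temperature`, and all thresholds are hypothetical placeholders, not the authors' method.

```python
# Hypothetical sketch of phase-aware hyperparameter tuning (not the paper's algorithm).
import copy
import torch
import torch.nn as nn


def sharpness_proxy(model, loss_fn, data, target, sigma=1e-3, n_samples=5):
    """Estimate local sharpness as the mean loss increase under small
    random Gaussian perturbations of the weights (an assumed metric)."""
    base_loss = loss_fn(model(data), target).item()
    increases = []
    for _ in range(n_samples):
        perturbed = copy.deepcopy(model)
        with torch.no_grad():
            for p in perturbed.parameters():
                p.add_(sigma * torch.randn_like(p))
        increases.append(loss_fn(perturbed(data), target).item() - base_loss)
    return sum(increases) / n_samples


def classify_phase(sharpness, low=1e-3, high=1e-1):
    """Map the measured sharpness to a coarse loss-landscape 'phase'
    (thresholds are illustrative placeholders)."""
    if sharpness < low:
        return "flat"
    if sharpness > high:
        return "sharp"
    return "intermediate"


def adjust_temperature(lr, phase, factor=2.0):
    """Toy update rule: raise the 'temperature' (here, the learning rate)
    in flat regions, lower it in sharp ones, otherwise leave it alone."""
    if phase == "flat":
        return lr * factor
    if phase == "sharp":
        return lr / factor
    return lr


if __name__ == "__main__":
    # Small synthetic setup so the sketch runs end to end.
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
    loss_fn = nn.CrossEntropyLoss()
    data, target = torch.randn(128, 20), torch.randint(0, 2, (128,))

    lr = 0.1
    s = sharpness_proxy(model, loss_fn, data, target)
    phase = classify_phase(s)
    lr = adjust_temperature(lr, phase)
    print(f"sharpness={s:.4g}, phase={phase}, new lr={lr}")
```

The point of the sketch is the control flow: a cheap landscape measurement replaces repeated full training runs, which is why a phase-based rule can be faster than grid search or SMBO over the same hyperparameter.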
