This paper searches for the optimal neural architecture by minimizing a proxy of the validation loss. Existing neural architecture search (NAS) methods aim to discover the neural architecture that best fits the validation examples given the up-to-date network weights. However, backpropagation over a large number of validation examples can be time-consuming, especially when it must be repeated many times during the search. Although these intermediate validation results are invaluable, they are wasted unless we can use them to predict the future from the past. In this paper, we propose to approximate the validation loss landscape by learning a mapping from neural architectures to their corresponding validation losses. The optimal neural architecture can then be easily identified as the minimum of this proxy validation loss landscape. A novel sampling strategy is further developed for an efficient approximation of the loss landscape. Theoretical analysis indicates that our sampling strategy achieves a lower error rate and a lower label complexity than uniform sampling. Experimental results on benchmarks demonstrate that the architecture found by the proposed algorithm achieves satisfactory accuracy at a lower time cost.
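The core idea of the abstract can be illustrated with a minimal sketch. This is not the paper's actual method or sampling strategy; it only shows the general pattern of learning a proxy from architecture encodings to validation losses and taking the argmin. The encoding scheme, the linear proxy model, and the `true_val_loss` stand-in are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each architecture is a binary vector over 8 candidate
# operations. Its true validation loss is a function we can only query via
# an expensive train-and-evaluate step; here a cheap linear stand-in.
def true_val_loss(arch):
    # stand-in for training network weights and evaluating on validation data
    return 1.0 + arch @ np.array([0.3, -0.2, 0.1, -0.4, 0.05, -0.1, 0.2, -0.15])

# 1) Evaluate a small sample of architectures and record their validation
#    losses (in the paper, such results arise during the search itself).
pool = rng.integers(0, 2, size=(200, 8)).astype(float)
idx = rng.choice(len(pool), size=30, replace=False)
X = pool[idx]
y = np.array([true_val_loss(a) for a in X])

# 2) Learn a proxy of the loss landscape: here a least-squares linear map
#    from architecture encodings to observed validation losses.
Xb = np.hstack([X, np.ones((len(X), 1))])          # add a bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# 3) Identify a promising architecture as the minimum of the proxy
#    landscape, without evaluating every candidate directly.
poolb = np.hstack([pool, np.ones((len(pool), 1))])
best = pool[np.argmin(poolb @ w)]
```

Because only the 30 sampled architectures are evaluated for real, the remaining 170 candidates are ranked for free by the proxy; the paper's contribution is a sampling strategy that makes this approximation accurate with fewer labeled (i.e., evaluated) architectures than uniform sampling.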
Author Information
Yanxi Li (University of Sydney)
Minjing Dong (The University of Sydney)
Yunhe Wang (Noah's Ark Lab, Huawei Technologies)
Chang Xu (University of Sydney)
More from the Same Authors
- 2023 Poster: Dual Focal Loss for Calibration
  Linwei Tao · Minjing Dong · Chang Xu
- 2023 Poster: PixelAsParam: A Gradient View on Diffusion Sampling with Guidance
  Anh-Dung Dinh · Daochang Liu · Chang Xu
- 2022 Poster: Spatial-Channel Token Distillation for Vision MLPs
  Yanxi Li · Xinghao Chen · Minjing Dong · Yehui Tang · Yunhe Wang · Chang Xu
- 2022 Spotlight: Spatial-Channel Token Distillation for Vision MLPs
  Yanxi Li · Xinghao Chen · Minjing Dong · Yehui Tang · Yunhe Wang · Chang Xu
- 2022 Poster: Federated Learning with Positive and Unlabeled Data
  Xinyang Lin · Hanting Chen · Yixing Xu · Chao Xu · Xiaolin Gui · Yiping Deng · Yunhe Wang
- 2022 Spotlight: Federated Learning with Positive and Unlabeled Data
  Xinyang Lin · Hanting Chen · Yixing Xu · Chao Xu · Xiaolin Gui · Yiping Deng · Yunhe Wang
- 2021 Poster: Commutative Lie Group VAE for Disentanglement Learning
  Xinqi Zhu · Chang Xu · Dacheng Tao
- 2021 Oral: Commutative Lie Group VAE for Disentanglement Learning
  Xinqi Zhu · Chang Xu · Dacheng Tao
- 2021 Poster: Learning to Weight Imperfect Demonstrations
  Yunke Wang · Chang Xu · Bo Du · Honglak Lee
- 2021 Poster: K-shot NAS: Learnable Weight-Sharing for NAS with K-shot Supernets
  Xiu Su · Shan You · Mingkai Zheng · Fei Wang · Chen Qian · Changshui Zhang · Chang Xu
- 2021 Spotlight: K-shot NAS: Learnable Weight-Sharing for NAS with K-shot Supernets
  Xiu Su · Shan You · Mingkai Zheng · Fei Wang · Chen Qian · Changshui Zhang · Chang Xu
- 2021 Spotlight: Learning to Weight Imperfect Demonstrations
  Yunke Wang · Chang Xu · Bo Du · Honglak Lee
- 2021 Poster: Winograd Algorithm for AdderNet
  Wenshuo Li · Hanting Chen · Mingqiang Huang · Xinghao Chen · Chunjing Xu · Yunhe Wang
- 2021 Spotlight: Winograd Algorithm for AdderNet
  Wenshuo Li · Hanting Chen · Mingqiang Huang · Xinghao Chen · Chunjing Xu · Yunhe Wang
- 2020 Poster: Training Binary Neural Networks through Learning with Noisy Supervision
  Kai Han · Yunhe Wang · Yixing Xu · Chunjing Xu · Enhua Wu · Chang Xu
- 2019 Poster: LegoNet: Efficient Convolutional Neural Networks with Lego Filters
  Zhaohui Yang · Yunhe Wang · Chuanjian Liu · Hanting Chen · Chunjing Xu · Boxin Shi · Chao Xu · Chang Xu
- 2019 Oral: LegoNet: Efficient Convolutional Neural Networks with Lego Filters
  Zhaohui Yang · Yunhe Wang · Chuanjian Liu · Hanting Chen · Chunjing Xu · Boxin Shi · Chao Xu · Chang Xu
- 2017 Poster: Beyond Filters: Compact Feature Map for Portable Deep Model
  Yunhe Wang · Chang Xu · Chao Xu · Dacheng Tao
- 2017 Talk: Beyond Filters: Compact Feature Map for Portable Deep Model
  Yunhe Wang · Chang Xu · Chao Xu · Dacheng Tao