This paper formalizes binarization operations over neural networks from a learning perspective. In contrast to classical hand-crafted rules (e.g., hard thresholding) for binarizing full-precision neurons, we propose to learn a mapping from full-precision neurons to the target binary ones. Individual weight entries are not binarized independently; instead, they are treated as a whole during binarization, just as they work together in generating convolution features. To assist the training of this binarization mapping, the full-precision neurons after taking the sign operation are regarded as an auxiliary supervision signal, which is noisy but still provides valuable guidance. An unbiased estimator is therefore introduced to mitigate the influence of the supervision noise. Experimental results on benchmark datasets indicate that the proposed binarization technique attains consistent improvements over the baselines.
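A minimal, hypothetical sketch of the core idea in PyTorch may help: a small network maps all weights of a filter jointly to binary-valued outputs, trained against the (noisy) signs of the full-precision weights. The module name LearnedBinarizer, the mapper architecture, and the MSE loss are illustrative assumptions; the paper's exact mapping and unbiased estimator are not specified on this page.

    import torch
    import torch.nn as nn

    class LearnedBinarizer(nn.Module):
        """Maps a full-precision weight vector to binary values jointly,
        rather than thresholding each entry independently."""
        def __init__(self, num_weights, hidden=64):
            super().__init__()
            # A small network that looks at all weight entries together.
            self.mapper = nn.Sequential(
                nn.Linear(num_weights, hidden),
                nn.ReLU(),
                nn.Linear(hidden, num_weights),
            )

        def forward(self, w_full):
            # Soft binary prediction in (-1, 1); a hard sign is taken at inference.
            return torch.tanh(self.mapper(w_full))

    w_full = torch.randn(1, 256)        # flattened filter weights
    binarizer = LearnedBinarizer(num_weights=256)
    target = torch.sign(w_full)         # noisy but informative auxiliary labels

    w_bin = binarizer(w_full)
    loss = nn.functional.mse_loss(w_bin, target)  # plus the task loss in practice
    loss.backward()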
Author Information
Kai Han (Noah's Ark Lab, Huawei Technologies)
Yunhe Wang (Noah's Ark Lab, Huawei Technologies)
Yixing Xu (Huawei Technologies)
Chunjing Xu (Noah's Ark Lab, Huawei Technologies)
Enhua Wu (CAS)
Chang Xu (University of Sydney)
More from the Same Authors
- 2020 Poster: Neural Architecture Search in A Proxy Validation Loss Landscape »
  Yanxi Li · Minjing Dong · Yunhe Wang · Chang Xu
- 2019 Poster: LegoNet: Efficient Convolutional Neural Networks with Lego Filters »
  Zhaohui Yang · Yunhe Wang · Chuanjian Liu · Hanting Chen · Chunjing Xu · Boxin Shi · Chao Xu · Chang Xu
- 2019 Oral: LegoNet: Efficient Convolutional Neural Networks with Lego Filters »
  Zhaohui Yang · Yunhe Wang · Chuanjian Liu · Hanting Chen · Chunjing Xu · Boxin Shi · Chao Xu · Chang Xu
- 2017 Poster: Beyond Filters: Compact Feature Map for Portable Deep Model »
  Yunhe Wang · Chang Xu · Chao Xu · Dacheng Tao
- 2017 Talk: Beyond Filters: Compact Feature Map for Portable Deep Model »
  Yunhe Wang · Chang Xu · Chao Xu · Dacheng Tao