Training Binary Neural Networks through Learning with Noisy Supervision
Kai Han · Yunhe Wang · Yixing Xu · Chunjing Xu · Enhua Wu · Chang Xu

Tue Jul 14 07:00 AM -- 07:45 AM & Tue Jul 14 08:00 PM -- 08:45 PM (PDT)

This paper formalizes the binarization of neural networks from a learning perspective. In contrast to classical hand-crafted rules (e.g., hard thresholding) for binarizing full-precision neurons, we propose to learn a mapping from full-precision neurons to the target binary ones. Individual weight entries are not binarized independently; instead, they are treated as a whole during binarization, just as they act together when generating convolution features. To guide the training of this binarization mapping, the full-precision neurons after the sign operation are regarded as an auxiliary supervision signal, which is noisy but still provides valuable guidance. An unbiased estimator is therefore introduced to mitigate the influence of the supervision noise. Experimental results on benchmark datasets indicate that the proposed binarization technique attains consistent improvements over baselines.
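The key idea of the noise-corrected supervision can be sketched numerically. The snippet below is a hypothetical illustration (not the paper's released code): it treats sign(w) as binary labels corrupted by a symmetric flip rate `rho`, and applies the classic unbiased loss estimator for label noise, whose expectation equals the loss against the clean labels. All function names and the soft mapping `tanh(3w)` are illustrative assumptions.

```python
import numpy as np

def loss(pred, label):
    # Squared loss between a (soft) predicted binary value and a supervision label.
    return (pred - label) ** 2

def unbiased_loss(pred, noisy_label, rho):
    # Unbiased estimator for symmetric label noise with flip rate rho:
    # its expectation over the noise equals loss(pred, clean_label).
    return ((1 - rho) * loss(pred, noisy_label)
            - rho * loss(pred, -noisy_label)) / (1 - 2 * rho)

rng = np.random.default_rng(0)
n = 100_000
w = rng.standard_normal(n)              # full-precision weights
clean = np.sign(w)                      # ideal binary targets (unknown in practice)
rho = 0.2                               # assumed symmetric noise rate
flip = rng.random(n) < rho
noisy = np.where(flip, -clean, clean)   # the noisy supervision actually observed

pred = np.tanh(3 * w)                   # an illustrative learned soft mapping
# Averaged over many weights, the corrected loss approximates the clean loss.
gap = abs(unbiased_loss(pred, noisy, rho).mean() - loss(pred, clean).mean())
print(gap)
```

The small printed gap illustrates why the correction works: the two extra terms in `unbiased_loss` cancel the bias introduced by the flipped labels in expectation.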

Author Information

Kai Han (Noah’s Ark Lab, Huawei Technologies)
Yunhe Wang (Noah's Ark Lab, Huawei Technologies)
Yixing Xu (Huawei Technologies)
Chunjing Xu (Huawei Noah's Ark Lab)
Enhua Wu (CAS)
Chang Xu (University of Sydney)
