Poster in Workshop: Hardware-aware efficient training (HAET)
Low-Bit DNN Training with Hardware-Efficient Stochastic Rounding Unit Design
Sung-En Chang · Geng Yuan · Alec Lu · Mengshu Sun · Yanyu Li · Xiaolong Ma · Yanyue Xie · Minghai Qin · Xue Lin · Zhenman Fang · Yanzhi Wang
Abstract:
Stochastic rounding is crucial in the training of low-bit deep neural networks (DNNs) to achieve high accuracy. Unfortunately, prior studies require a large number of high-precision stochastic rounding units (SRUs) to guarantee low-bit DNN accuracy, which incurs considerable hardware overhead. In this paper, we propose an automated framework to explore hardware-efficient low-bit SRUs (ESRUs) that can still generate high-quality random numbers to guarantee the accuracy of low-bit DNN training. Experimental results using state-of-the-art DNN models demonstrate that, compared to the prior 24-bit SRU with a 24-bit pseudo-random number generator (PRNG), our 8-bit ESRU with a 3-bit PRNG reduces SRU resource usage by $9.75\times$ while achieving higher accuracy.
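For readers unfamiliar with the underlying operation, below is a minimal sketch of generic stochastic rounding, assuming a software uniform PRNG in place of a hardware SRU; it illustrates only the unbiased-rounding idea and is not the ESRU hardware design proposed in the paper. The function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def stochastic_round(x, num_bits=8, scale=1.0):
    """Stochastically round scaled values to a signed num_bits integer grid.

    Each value rounds up with probability equal to its fractional part,
    so the rounding is unbiased in expectation: E[round(x)] = x.
    (Illustrative sketch; not the paper's ESRU.)
    """
    q = x / scale
    lower = np.floor(q)
    frac = q - lower  # fractional part in [0, 1)
    # Round up with probability equal to the fractional part.
    rounded = lower + (np.random.random_sample(q.shape) < frac)
    # Clamp to the representable signed range of the target bit-width.
    qmax = 2 ** (num_bits - 1) - 1
    return np.clip(rounded, -qmax - 1, qmax) * scale

# Repeatedly rounding 0.3 averages back to ~0.3, whereas
# deterministic round-to-nearest would always return 0.
samples = stochastic_round(np.full(100_000, 0.3))
print(samples.mean())  # ~0.3
```

This unbiasedness is why stochastic rounding preserves small gradient updates that deterministic rounding would zero out in low-bit training; the paper's contribution is generating the required randomness with far cheaper hardware.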