

Poster in Workshop: Hardware-aware efficient training (HAET)

Get the Random Number on the fly: A Low-Precision DNN Training Framework using Stochastic Rounding without the Random Number Generator

Geng Yuan · Sung-En Chang · Alec Lu · Jun Liu · Yanyu Li · Yushu Wu · Zhenglun Kong · Yanyue Xie · Peiyan Dong · Minghai Qin · Xiaolong Ma · Zhenman Fang · Yanzhi Wang


Abstract:

Stochastic rounding is a critical technique in low-precision deep neural network (DNN) training for preserving model accuracy. However, it requires a large number of random numbers generated on the fly, which is not a trivial task on hardware platforms such as FPGAs and ASICs. The widely used solution is to introduce random number generators at extra hardware cost. In this paper, we propose to exploit the stochastic properties of the DNN training process itself and directly extract random numbers from DNNs in a self-sufficient manner. We propose several methods to obtain random numbers from different sources within neural networks, and we build a generator-free framework for low-precision DNN training on a variety of deep learning tasks. Moreover, we evaluate the quality of the extracted random numbers and find that high-quality random numbers widely exist in DNNs; their quality can even pass the NIST test suite.
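To make the core operation concrete, below is a minimal Python/NumPy sketch of stochastic rounding, where the fractional remainder of each value sets the probability of rounding up so that the rounding is unbiased in expectation. The `extract_uniforms` helper, which slices low-order mantissa bits from a tensor already present in the network to stand in for a hardware RNG, is a hypothetical illustration of the generator-free idea only; the abstract does not specify the authors' actual extraction sources or methods.

```python
import numpy as np

def stochastic_round(x, scale=2**-4, u=None):
    """Stochastically round x onto a fixed-point grid with step `scale`.

    The fractional remainder gives the probability of rounding up,
    making the rounding error zero in expectation.
    """
    scaled = np.asarray(x, dtype=np.float64) / scale
    floor = np.floor(scaled)
    frac = scaled - floor                    # fractional part in [0, 1)
    if u is None:
        u = np.random.random(scaled.shape)   # conventional RNG baseline
    return (floor + (u < frac)) * scale

def extract_uniforms(tensor):
    """Hypothetical illustration only: derive uniforms in [0, 1) from the
    16 low-order mantissa bits of float32 values already present in the
    network (e.g., activations), in place of a dedicated hardware RNG."""
    bits = np.ascontiguousarray(tensor, dtype=np.float32).view(np.uint32)
    return (bits & 0xFFFF).astype(np.float64) / 2**16

# Usage: round a gradient tensor with randomness drawn from activations.
activations = np.random.randn(4, 4).astype(np.float32)  # stand-in tensor
gradients = np.random.randn(4, 4)
q_grad = stochastic_round(gradients, u=extract_uniforms(activations))
```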
