Poster in Workshop: Hardware-aware efficient training (HAET)

Energy-aware Network Operator Search in Deep Neural Networks

Shamma Nasrin


Abstract:

This work proposes a novel Energy-aware Network Operator Search (ENOS) approach to address the energy-accuracy trade-offs of a deep neural network (DNN) accelerator. The proposed ENOS framework allows an optimal layer-wise integration of inference operators at optimal precision, maintaining high prediction accuracy along with high energy efficiency. The search is formulated as a continuous optimization problem, solvable using gradient descent methods, thereby minimally increasing the training cost when learning both layer-wise inference operators and weights. We discuss multiply-accumulate (MAC) cores for digital spatial architectures that can be reconfigured to different operators and varying computing precision. We discuss and compare ENOS training methods with single-level and bi-level optimization objectives, as well as a sequential operator assignment strategy and a stochastic mode of ENOS. Characterized on ShuffleNet and SqueezeNet using CIFAR10 and CIFAR100, ENOS improves accuracy by 10-20% compared to conventional uni-operator approaches and by 3-5% compared to mixed-precision uni-operator implementations for the same energy budget.
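To make the continuous formulation concrete, the sketch below shows one plausible way such a search could be relaxed to be differentiable: each layer holds a softmax-weighted mixture of candidate operators, and an expected-energy penalty is added to the task loss so that operator logits and weights can be learned jointly with gradient descent. This is a minimal, hypothetical Python/PyTorch illustration, not the authors' implementation; the names MixedOp, ENERGY_COST, and lambda_energy, and the specific candidate operators and energy numbers, are all assumptions for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed per-operator energy costs in arbitrary units (illustrative only).
ENERGY_COST = torch.tensor([1.0, 0.4, 0.25])

class MixedOp(nn.Module):
    """One layer's softmax-weighted mixture of candidate inference operators."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # Stand-ins for reconfigurable MAC-core operators; all preserve the
        # spatial size so their outputs can be mixed.
        self.ops = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, 3, padding=1),              # full conv
            nn.Conv2d(in_ch, out_ch, 3, padding=1, groups=in_ch  # cheaper op
                      if in_ch == out_ch else 1),
            nn.Conv2d(in_ch, out_ch, 1),                         # 1x1 conv
        ])
        # Architecture parameters: one logit per candidate operator.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

    def expected_energy(self):
        # Differentiable expected energy of this layer under the soft choice.
        return (F.softmax(self.alpha, dim=0) * ENERGY_COST).sum()

def enos_loss(logits, target, mixed_layers, lambda_energy=0.1):
    """Single-level objective: task loss plus an energy regularizer.

    lambda_energy is a hypothetical hyperparameter trading accuracy for
    energy. A bi-level variant would instead alternate updates of the
    operator logits (alpha) on validation data and the weights on
    training data, rather than optimizing both in one step.
    """
    task = F.cross_entropy(logits, target)
    energy = sum(layer.expected_energy() for layer in mixed_layers)
    return task + lambda_energy * energy

After training, a layer's operator would typically be fixed to the argmax of its logits, so the deployed network runs a single discrete operator per layer within the energy budget.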
