

Poster

SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks at the Edge

Mahdi Nikdan · Tommaso Pegolotti · Eugenia Iofinova · Eldar Kurtic · Dan Alistarh

Exhibit Hall 1 #314

Abstract:

We provide an efficient implementation of the backpropagation algorithm, specialized to the case where the weights of the neural network being trained are sparse. Our algorithm is general, as it applies to arbitrary (unstructured) sparsity and common layer types (e.g., convolutional or linear). We provide a fast vectorized implementation on commodity CPUs, and show that it can yield speedups in end-to-end runtime experiments, both in transfer learning using already-sparsified networks and in training sparse networks from scratch. Thus, our results provide the first support for sparse training on commodity hardware.
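To make the core idea concrete, here is a minimal sketch of sparse backpropagation through a single linear layer, assuming the weights are stored in CSR format. This is an illustrative example, not the authors' SparseProp kernels: it shows why the forward pass, the input gradient, and the weight gradient (restricted to the fixed nonzero pattern) all cost O(nnz) work rather than O(dense size).

```python
# Illustrative sketch (not the authors' SparseProp implementation):
# backprop through a linear layer y = W @ x where W is unstructured-sparse.
# Weight gradients are computed only for stored nonzeros, so backward-pass
# cost scales with nnz(W) instead of out_dim * in_dim.
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)

# Sparse weight matrix W (out_dim x in_dim), ~90% of entries zero.
out_dim, in_dim = 64, 128
W = sp.random(out_dim, in_dim, density=0.1, format="csr", random_state=0)

x = rng.standard_normal(in_dim)        # input activation
grad_y = rng.standard_normal(out_dim)  # upstream gradient dL/dy

# Forward: sparse matrix-vector product, O(nnz) work.
y = W @ x

# Backward w.r.t. the input: dL/dx = W^T @ grad_y, again O(nnz).
grad_x = W.T @ grad_y

# Backward w.r.t. the weights, restricted to W's sparsity pattern:
# for each stored entry (i, j), dL/dW[i, j] = grad_y[i] * x[j].
grad_W_data = np.empty_like(W.data)
for i in range(out_dim):
    start, end = W.indptr[i], W.indptr[i + 1]
    grad_W_data[start:end] = grad_y[i] * x[W.indices[start:end]]

# Sanity check against the dense outer-product gradient, masked to the
# nonzero pattern (CSR stores entries in row-major order, matching nonzero()).
dense_grad = np.outer(grad_y, x)
rows, cols = (W.toarray() != 0).nonzero()
assert np.allclose(grad_W_data, dense_grad[rows, cols])
```

The inner loop over rows stands in for what the paper implements as vectorized CPU kernels; the key point is that the sparsity pattern is fixed during training, so gradients never materialize a dense weight matrix.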
