

Poster

Band-limited Training and Inference for Convolutional Neural Networks

Adam Dziedzic · John Paparrizos · Sanjay Krishnan · Aaron Elmore · Michael Franklin

Pacific Ballroom #132

Keywords: [ Time Series and Sequence Models ] [ Systems and Software ] [ Others ] [ Computer Vision ] [ Approximate Inference ]


Abstract:

Convolutional layers are core building blocks of neural network architectures. In general, a convolutional filter applies to the entire frequency spectrum of the input data. We explore artificially constraining the frequency spectra of these filters and data, called band-limiting, during training. The frequency-domain constraints apply to both the feed-forward and back-propagation steps. Experimentally, we observe that Convolutional Neural Networks (CNNs) are resilient to this compression scheme, and the results suggest that CNNs learn to leverage lower-frequency components. In particular, we found that: (1) band-limited training can effectively control resource usage (GPU and memory); (2) models trained with band-limited layers retain high prediction accuracy; and (3) band-limited training requires no modification to existing training algorithms or neural network architectures, unlike other compression schemes.
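To make the idea concrete, below is a minimal single-channel sketch of a band-limited convolution in the frequency domain: both the input and the filter are transformed with the FFT, coefficients outside a retained low-frequency band are zeroed, and the point-wise product is transformed back. The function name `band_limited_conv2d` and the `keep_fraction` parameter are illustrative assumptions for this sketch, not the authors' implementation or API.

```python
import numpy as np

def band_limited_conv2d(x, w, keep_fraction=0.5):
    """Illustrative band-limited 2D convolution (single channel).

    The convolution is performed in the frequency domain; coefficients
    outside the retained low-frequency band are zeroed for both the input
    and the filter before the point-wise product. This is a sketch of the
    band-limiting idea only, not the paper's implementation.
    """
    H, W = x.shape
    # FFT-based (circular) convolution: pad the filter to the input size.
    X = np.fft.fft2(x)
    F = np.fft.fft2(w, s=(H, W))

    # Low-pass mask keeping only the lowest `keep_fraction` of frequencies
    # along each axis (built in the centered layout, then shifted back).
    mask = np.zeros((H, W))
    kh, kw = int(H * keep_fraction) // 2, int(W * keep_fraction) // 2
    ch, cw = H // 2, W // 2
    mask[ch - kh:ch + kh + 1, cw - kw:cw + kw + 1] = 1.0
    mask = np.fft.ifftshift(mask)

    # Point-wise product restricted to the retained band, back to spatial domain.
    Y = (X * mask) * (F * mask)
    return np.real(np.fft.ifft2(Y))

# Example: a 32x32 input convolved with a 3x3 filter, keeping half the spectrum.
x = np.random.randn(32, 32)
w = np.random.randn(3, 3)
y = band_limited_conv2d(x, w, keep_fraction=0.5)
print(y.shape)  # (32, 32)
```

Because the masked coefficients never participate in the product, the same idea can reduce the memory and compute footprint of both the forward and backward passes, which is the resource-control effect the abstract describes.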
