Many applications involving visual and language understanding can be effectively solved using deep neural networks. Although these techniques achieve state-of-the-art results, it is very challenging to apply them on devices with limited memory and computational capacity, such as mobile phones, smart watches, and IoT devices. We propose a neural projection approach for training compact on-device neural networks. We introduce a "projection" network that uses locality-sensitive projections to generate compact binary representations and learns small neural networks with computationally efficient operations. We design a joint optimization framework in which the projection network can be trained from scratch or can leverage existing larger neural networks such as feed-forward NNs, CNNs, or RNNs. The trained neural projection network can be used directly for inference on device at low memory and computation cost. We demonstrate the effectiveness of this general-purpose approach for significantly shrinking the memory requirements of different types of neural networks while preserving good accuracy on various visual recognition and text classification tasks. We also discuss novel extensions of the approach and derive projection models for other learning scenarios and real-world on-device applications.
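As a rough illustration of the locality-sensitive projection idea described above (not the paper's implementation), the sketch below hashes a dense feature vector into a compact bit vector using random hyperplanes, so that similar inputs tend to receive similar binary codes. The function name lsh_projection and the num_bits/seed parameters are illustrative assumptions, not taken from the paper.

    import numpy as np

    def lsh_projection(x, num_bits=64, seed=0):
        """Map a dense feature vector x to a num_bits-dimensional binary code."""
        rng = np.random.default_rng(seed)
        # One random hyperplane per output bit; a fixed seed makes the
        # projection deterministic, so no projection matrix needs to be stored.
        planes = rng.standard_normal((num_bits, x.shape[-1]))
        # Keep only the sign of each projection -> a {0, 1} bit per hyperplane.
        return (planes @ x > 0).astype(np.int8)

    # Nearby inputs share most of their bits (locality sensitivity).
    a = np.random.rand(300)
    b = a + 0.01 * np.random.rand(300)
    print(np.mean(lsh_projection(a) == lsh_projection(b)))  # close to 1.0

In the paper's setting, such compact binary representations serve as the input to a small projection network that is trained jointly with (or distilled from) a larger trainer network.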
Author Information
Sujith Ravi (Google Research)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Efficient On-Device Models using Neural Projections »
  Fri. Jun 14th 01:30 -- 04:00 AM, Room Pacific Ballroom #103
More from the Same Authors
- 2019 Workshop: Joint Workshop on On-Device Machine Learning & Compact Deep Neural Network Representations (ODML-CDNNR) »
  Sujith Ravi · Zornitsa Kozareva · Lixin Fan · Max Welling · Yurong Chen · Werner Bailer · Brian Kulis · Haoji Hu · Jonathan Dekhtiar · Yingyan Lin · Diana Marculescu