Poster in Workshop: Dynamic Neural Networks
Vote for Nearest Neighbors Meta-Pruning of Self-Supervised Networks
Haiyan Zhao · Tianyi Zhou · Guodong Long · Jing Jiang · Chengqi Zhang
Pruning plays an essential role in deploying deep neural networks (DNNs) on hardware with limited memory or computation. However, current high-quality iterative pruning incurs a substantial carbon footprint when compressing a large DNN for a wide variety of devices and tasks. Can we reuse the pruning results from previous tasks to accelerate the pruning for a new task? Can we find a better initialization for a new task? We study this "nearest neighbors meta-pruning" problem by first investigating different choices of pre-trained models for pruning under limited iterations. Our empirical study reveals several advantages of self-supervised pre-trained models when pruned for multiple tasks. We further study the overlap of pruned models for similar tasks. Inspired by these discoveries, we develop
"Meta-Vote Pruning (MVP)", which dynamically votes for each filter in the pre-trained model according to its chance of being selected by the pruned models of similar tasks, so that a better-initialized sub-network is generated for the new task. In experiments, we demonstrate the advantages of MVP through extensive empirical studies and comparisons with popular pruning methods.
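The voting step described above can be sketched as follows. This is a minimal, hypothetical illustration (the function name `meta_vote_init` and the exact selection rule are assumptions, not the authors' implementation): each nearest-neighbor task contributes a binary keep-mask over a layer's filters, votes are tallied per filter, and the top-voted filters initialize the sub-network for the new task.

```python
def meta_vote_init(neighbor_masks, keep_ratio):
    """Hypothetical sketch of meta-vote initialization for one layer.

    neighbor_masks: list of binary lists (one per similar task),
        each of length num_filters; 1 means that task's pruned
        model kept the filter.
    keep_ratio: fraction of filters to keep for the new task.
    Returns a binary mask selecting the most-voted filters.
    """
    num_filters = len(neighbor_masks[0])
    # Tally votes: how many similar tasks kept each filter.
    votes = [sum(m[i] for m in neighbor_masks) for i in range(num_filters)]
    k = int(num_filters * keep_ratio)
    # Keep the k filters with the most votes (ties broken by index).
    keep = set(sorted(range(num_filters), key=lambda i: -votes[i])[:k])
    return [1 if i in keep else 0 for i in range(num_filters)]

# Three similar tasks voting over a 6-filter layer:
masks = [[1, 1, 0, 0, 1, 0],
         [1, 0, 1, 0, 1, 0],
         [1, 1, 0, 0, 0, 1]]
print(meta_vote_init(masks, keep_ratio=0.5))  # → [1, 1, 0, 0, 1, 0]
```

Filters kept by many neighboring tasks are likely relevant to the new task as well, so this initialization can reduce the number of pruning iterations needed.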