Oral
Zero-Shot Knowledge Distillation in Deep Networks
Gaurav Kumar Nayak · Konda Reddy Mopuri · Vaisakh Shaj · Venkatesh Babu Radhakrishnan · Anirban Chakraborty

Thu Jun 13 11:30 AM -- 11:35 AM (PDT) @ Grand Ballroom

Knowledge distillation deals with the problem of training a smaller model from a high-capacity model so as to retain most of its performance. The source and target models are generally referred to as the Teacher and Student models, respectively. Existing approaches use either the training data or meta-data extracted from it in order to train the Student. However, accessing the dataset on which the Teacher has been trained may not always be feasible if the dataset is very large or if it poses privacy or safety concerns (e.g., biometric or medical data). Therefore, in this paper, we propose a novel data-free method to train the Student from the Teacher. Without utilizing any meta-data, we extract Data Impressions from the parameters of the Teacher model and use these as surrogates for the original training data samples to transfer its learning to the Student via knowledge distillation. Hence we dub our method "Zero-Shot Knowledge Distillation". We demonstrate that, on multiple benchmark datasets, our framework achieves generalization performance competitive with that obtained using the actual training data samples.
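At a high level, the abstract describes two steps: synthesize Data Impressions by optimizing inputs against the frozen Teacher, then distill the Teacher into the Student on those impressions. The sketch below is an illustrative PyTorch-style interpretation of those two steps only; the function names, hyperparameters, and the specific temperature-scaled cross-entropy/KL objectives are assumptions, not the authors' released code.

import torch
import torch.nn.functional as F

# Hypothetical sketch (not the paper's code). `teacher` and `student` are
# assumed to be ordinary classification networks returning logits.

def synthesize_data_impression(teacher, target_softmax, input_shape,
                               steps=1500, lr=0.01, temperature=20.0):
    """Optimize a random input so the Teacher's temperature-scaled softmax
    output matches a sampled target probability vector."""
    x = torch.randn(1, *input_shape, requires_grad=True)
    optimizer = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        logits = teacher(x)
        # Cross-entropy between the target distribution and the Teacher's output.
        loss = -(target_softmax * F.log_softmax(logits / temperature, dim=1)).sum()
        loss.backward()
        optimizer.step()
    return x.detach()

def distill_on_impressions(teacher, student, impressions,
                           epochs=10, lr=0.001, temperature=20.0):
    """Train the Student to match the Teacher's soft outputs on the impressions."""
    optimizer = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x in impressions:
            optimizer.zero_grad()
            with torch.no_grad():
                t_soft = F.softmax(teacher(x) / temperature, dim=1)
            s_log_soft = F.log_softmax(student(x) / temperature, dim=1)
            # Standard distillation loss: KL divergence between soft distributions.
            loss = F.kl_div(s_log_soft, t_soft, reduction="batchmean") * temperature ** 2
            loss.backward()
            optimizer.step()

In this reading, the only inputs to the whole procedure are the Teacher's parameters: the target softmax vectors and the impressions are both derived from the Teacher itself, so no training data or meta-data is touched.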

Author Information

Gaurav Kumar Nayak (Indian Institute of Science)
Konda Reddy Mopuri (University of Edinburgh)
Vaisakh Shaj (University of Lincoln)

I am currently pursuing my PhD in Robot Learning under Prof. Gerhard Neumann at the CLAS Lab, University of Lincoln, UK. Before that, I was a Data Scientist at the cybersecurity firm McAfee (Intel Security). Prior to that, I worked with Intel for two years. I hold a postgraduate degree in Machine Learning and Computing from the Indian Institute of Space Science and Technology.

Venkatesh Babu Radhakrishnan (Indian Institute of Science)
Anirban Chakraborty (Indian Institute of Science)
