Poster

Zoo-Tuning: Adaptive Transfer from A Zoo of Models

Yang Shu · Zhi Kou · Zhangjie Cao · Jianmin Wang · Mingsheng Long

Virtual

Keywords: [ Architectures ]

Tue 20 Jul 9 a.m. PDT — 11 a.m. PDT
 
Spotlight presentation: Deep Learning Architectures
Tue 20 Jul 5 a.m. PDT — 6 a.m. PDT

Abstract:

With the development of deep networks on various large-scale datasets, a large zoo of pretrained models is available. When transferring from a model zoo, applying classic single-model-based transfer learning methods to each source model suffers from high computational cost and cannot fully utilize the rich knowledge in the zoo. We propose Zoo-Tuning to address these challenges, which learns to adaptively transfer the parameters of pretrained models to the target task. With learnable channel alignment and adaptive aggregation layers, Zoo-Tuning adaptively aggregates channel-aligned pretrained parameters to derive the target model, which simultaneously promotes knowledge transfer and adapts source models to the downstream task. The adaptive aggregation substantially reduces the computational cost of both training and inference. We further propose lite Zoo-Tuning, which uses a temporal ensemble of batch-average gating values to reduce the storage cost at inference time. We evaluate our approach on a variety of tasks, including reinforcement learning, image classification, and facial landmark detection. Experimental results demonstrate that the proposed adaptive transfer learning approach can more effectively and efficiently transfer knowledge from a zoo of models.
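To make the aggregation idea concrete, below is a minimal sketch of one layer that aligns several pretrained convolution kernels on the channel dimension and combines them with input-dependent gates before convolving. It is not the authors' released implementation: the class name AdaptiveAggConv2d, the sigmoid gate over a batch-averaged pooled input, and the per-source alignment matrices are illustrative assumptions about one way such a layer could look.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveAggConv2d(nn.Module):
    """Illustrative sketch of a Zoo-Tuning style layer: align each source
    kernel on its channel dimension, gate the aligned kernels with an
    input-dependent score, and convolve with the aggregated kernel.
    Names and the exact gating form are assumptions, not the paper's code."""

    def __init__(self, source_weights, stride=1, padding=1):
        super().__init__()
        # source_weights: list of K pretrained conv kernels, each (C_out, C_in, k, k)
        self.sources = nn.ParameterList(
            [nn.Parameter(w.clone(), requires_grad=False) for w in source_weights]
        )
        c_out, c_in, _, _ = source_weights[0].shape
        # Learnable channel alignment: one linear map over output channels per source.
        self.align = nn.ParameterList(
            [nn.Parameter(torch.eye(c_out)) for _ in source_weights]
        )
        # Lightweight gating head: pooled input -> one gate per source model.
        self.gate = nn.Linear(c_in, len(source_weights))
        self.stride, self.padding = stride, padding

    def forward(self, x):
        # Input-dependent gating values, averaged over the batch and spatial dims.
        pooled = x.mean(dim=(0, 2, 3))              # (C_in,)
        gates = torch.sigmoid(self.gate(pooled))    # (K,)
        # Align each source kernel on the channel dimension and aggregate.
        agg = 0
        for g, a, w in zip(gates, self.align, self.sources):
            aligned = torch.einsum('oi,icxy->ocxy', a, w)  # mix output channels
            agg = agg + g * aligned
        # A single convolution with the aggregated kernel keeps inference cheap.
        return F.conv2d(x, agg, stride=self.stride, padding=self.padding)


if __name__ == "__main__":
    # Toy zoo of three pretrained 3x3 kernels with matching shapes.
    zoo = [torch.randn(16, 8, 3, 3) for _ in range(3)]
    layer = AdaptiveAggConv2d(zoo)
    out = layer(torch.randn(4, 8, 32, 32))
    print(out.shape)  # torch.Size([4, 16, 32, 32])
```

The lite variant described in the abstract would, under the same assumptions, replace the per-batch gates at inference with a temporal ensemble of their batch-average values, so only one pre-aggregated kernel needs to be stored.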
