Spotlight
Frustratingly Easy Transferability Estimation
Long-Kai Huang · Ying WEI · Yu Rong · Qiang Yang · Junzhou Huang

Tue Jul 19 07:35 AM -- 07:40 AM (PDT)

Transferability estimation has been an essential tool for selecting a pre-trained model, and the layers in it, to transfer, so as to maximize performance on a target task and prevent negative transfer. Existing estimation algorithms either require intensive training on target tasks or have difficulty evaluating the transferability between layers. To this end, we propose a simple, efficient, and effective transferability measure named TransRate. With a single pass over the examples of a target task, TransRate measures transferability as the mutual information between the features of target examples extracted by a pre-trained model and their labels. We overcome the challenge of efficient mutual information estimation by resorting to the coding rate, which serves as an effective alternative to entropy. From the perspective of feature representation, the resulting TransRate evaluates both the completeness (whether the features contain sufficient information about a target task) and the compactness (whether the features of each class are compact enough for good generalization) of pre-trained features. Theoretically, we analyze the close connection between TransRate and the performance after transfer learning. Despite its extraordinary simplicity in 10 lines of code, TransRate performs remarkably well in extensive evaluations on 26 pre-trained models and 16 downstream tasks.
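The abstract's description of the estimator (mutual information between pre-trained features and labels, computed via the coding rate in a handful of lines) suggests a minimal sketch along the following lines. The function names, the epsilon value, the feature centering, and the per-class averaging are illustrative assumptions, not the authors' released code:

import numpy as np

def coding_rate(Z, eps=1e-4):
    # One common form of the coding rate, used here as a surrogate for entropy:
    # 0.5 * logdet(I + d / (n * eps^2) * Z^T Z) for an n x d feature matrix Z.
    n, d = Z.shape
    _, logdet = np.linalg.slogdet(np.eye(d) + (d / (n * eps ** 2)) * Z.T @ Z)
    return 0.5 * logdet

def transrate(Z, y, eps=1e-4):
    # Mutual-information-like score: coding rate of all features minus the
    # average coding rate within each class (compact classes => small penalty).
    Z = Z - Z.mean(axis=0, keepdims=True)  # center features (assumed preprocessing)
    rz = coding_rate(Z, eps)
    classes = np.unique(y)
    rzy = sum(coding_rate(Z[y == c], eps) for c in classes) / len(classes)
    return rz - rzy

Here Z would hold the target-task features extracted by a candidate pre-trained model (or one of its layers) and y the target labels; a higher score suggests a more transferable model or layer under this sketch.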

Author Information

Long-Kai Huang (Tencent AI Lab)
Ying WEI (City University of Hong Kong)
Yu Rong (Tencent AI Lab)
Qiang Yang (Hong Kong UST)
Junzhou Huang (University of Texas at Arlington / Tencent AI Lab)
