
Learning to Learn from APIs: Black-Box Data-Free Meta-Learning
Zixuan Hu · Li Shen · Zhenyi Wang · Baoyuan Wu · Chun Yuan · Dacheng Tao

Tue Jul 25 02:00 PM -- 04:30 PM (PDT) @ Exhibit Hall 1 #101
Event URL: https://github.com/Egg-Hu/BiDf-MKD

Data-free meta-learning (DFML) aims to enable efficient learning of new tasks by meta-learning from a collection of pre-trained models without access to their training data. Existing DFML work can only meta-learn from (i) white-box and (ii) small-scale pre-trained models (iii) with the same architecture, neglecting the more practical setting where users have only inference access to APIs whose internal models may have arbitrary architectures and scales. To address this, we propose a Bi-level Data-free Meta Knowledge Distillation (BiDf-MKD) framework that transfers more general meta knowledge from a collection of black-box APIs to a single meta model. Specifically, by merely querying each API, we invert it to recover its training data via a zero-order gradient estimator, and then perform meta-learning through a novel bi-level meta knowledge distillation structure, in which a boundary query set recovery technique recovers a more informative query set near the decision boundary. In addition, to encourage better generalization under limited API query budgets, we propose task memory replay, which diversifies the underlying task distribution by covering more interpolated tasks. Extensive experiments in various real-world scenarios show the superior performance of our BiDf-MKD framework.
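The core black-box step described above, inverting an API with only inference access by descending a zero-order gradient estimate, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy `api_predict` softmax model, the target-class cross-entropy objective, and all hyperparameters (`mu`, `n_samples`, step size) are assumptions chosen for a runnable example of two-point zeroth-order gradient estimation.

```python
import numpy as np

# Toy stand-in for a black-box API: we may only query predictions, never
# gradients. W is hidden "inside" the API; the inversion loop below uses
# only calls to api_predict, mimicking inference-only access.
rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(10, 32))  # hypothetical API weights (10 classes)

def api_predict(x):
    """Inference-only access: input vector -> softmax class probabilities."""
    z = W @ x
    e = np.exp(z - z.max())
    return e / e.sum()

def loss(x, target):
    """Cross-entropy of the API's output against a chosen target class."""
    return -np.log(api_predict(x)[target] + 1e-12)

def zero_order_grad(x, target, mu=1e-2, n_samples=32):
    """Two-point zeroth-order gradient estimate built purely from queries:
    g ~= E_u[(loss(x + mu*u) - loss(x - mu*u)) / (2*mu) * u] for Gaussian u."""
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.normal(size=x.shape)
        g += (loss(x + mu * u, target) - loss(x - mu * u, target)) / (2 * mu) * u
    return g / n_samples

# "Invert" the API: starting from noise, optimize an input that the API
# assigns to class 3, using only the estimated (not true) gradient.
x0 = rng.normal(size=32)
x = x0.copy()
for _ in range(300):
    x -= 0.5 * zero_order_grad(x, target=3)
```

Each optimization step spends `2 * n_samples` API queries, which is why the paper's limited-query-budget setting (motivating task memory replay) matters in practice.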

Author Information

Zixuan Hu (Tsinghua University)
Li Shen (JD Explore Academy)
Zhenyi Wang (University at Buffalo)
Baoyuan Wu (The Chinese University of Hong Kong, Shenzhen)
Chun Yuan (Graduate School at Shenzhen, Tsinghua University)
Dacheng Tao
