
Subspace Learning for Effective Meta-Learning
Weisen Jiang · James Kwok · Yu Zhang

Tue Jul 19 08:45 AM -- 08:50 AM (PDT)

Meta-learning aims to extract meta-knowledge from historical tasks to accelerate learning on new tasks. Typical meta-learning algorithms such as MAML learn a single, globally shared meta-model for all tasks. However, when the task environment is complex, the task model parameters are diverse, and a single meta-model is insufficient to capture all the meta-knowledge. To address this challenge, we formulate the task model parameters as lying in multiple subspaces, with each subspace representing one type of meta-knowledge. We propose an algorithm to learn the meta-parameters (i.e., the subspace bases) and theoretically study the generalization properties of the learned subspaces. Experiments on regression and classification meta-learning datasets verify the effectiveness of the proposed algorithm.
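To make the subspace idea concrete, the following is a minimal K-subspaces-style sketch, not the authors' algorithm: given a matrix of task parameter vectors, it alternately assigns each task to the subspace with the smallest projection residual and refits each subspace basis by truncated SVD. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def fit_subspaces(params, num_subspaces, dim, num_iters=20, seed=0):
    """Illustrative sketch only: partition task parameter vectors
    (rows of `params`, shape (num_tasks, d)) into `num_subspaces`
    subspaces of dimension `dim`, fitting each basis via SVD."""
    rng = np.random.default_rng(seed)
    d = params.shape[1]
    # Random orthonormal initial bases, each of shape (d, dim).
    bases = [np.linalg.qr(rng.standard_normal((d, dim)))[0]
             for _ in range(num_subspaces)]
    labels = np.zeros(len(params), dtype=int)
    for _ in range(num_iters):
        # Assignment step: each task goes to the subspace minimizing
        # the projection residual ||w - B B^T w||.
        residuals = np.stack([
            np.linalg.norm(params - params @ B @ B.T, axis=1)
            for B in bases
        ])  # shape (num_subspaces, num_tasks)
        labels = residuals.argmin(axis=0)
        # Update step: refit each basis from its assigned tasks
        # using the top right-singular vectors.
        for k in range(num_subspaces):
            members = params[labels == k]
            if len(members) >= dim:
                _, _, vt = np.linalg.svd(members, full_matrices=False)
                bases[k] = vt[:dim].T
    return bases, labels
```

A usage sketch: stack the adapted parameter vectors of the historical tasks into `params` and call `fit_subspaces(params, num_subspaces=4, dim=16)`; the returned bases then serve as the learned meta-parameters, and a new task can be adapted within its best-fitting subspace.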

Author Information

Weisen Jiang (Hong Kong University of Science and Technology)
James Kwok (Hong Kong University of Science and Technology)
Yu Zhang (Hong Kong University of Science and Technology)
