On the Subspace Structure of Gradient-Based Meta-Learning
Gustaf Tegnér · Alfredo Reichlin · Hang Yin · Mårten Björkman · Danica Kragic
Event URL: https://openreview.net/forum?id=OhF1atroYmE

In this work we provide an analysis of the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has observed that, in the case of image classification, this adaptation takes place largely in the last layers of the network. We propose the more general notion that the parameters are updated over a low-dimensional subspace whose dimensionality matches that of the task space, and show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the task space of common few-shot learning datasets.
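A minimal sketch (not the authors' code) of how such a subspace analysis might be carried out: adapt a meta-learned parameter vector to a collection of tasks, stack the resulting post-adaptation updates, and estimate the dimension of the subspace they span from the PCA spectrum. The task interface (task.grad), the inner-loop hyperparameters, and the variance threshold below are hypothetical placeholders chosen for illustration.

import numpy as np

def adaptation_direction(theta_meta, task, inner_steps=5, lr=1e-2):
    # Run a plain gradient-descent inner loop on one task and return the
    # post-adaptation update theta_task - theta_meta.
    # task.grad is a stand-in for the gradient of the task loss.
    theta = theta_meta.copy()
    for _ in range(inner_steps):
        theta -= lr * task.grad(theta)
    return theta - theta_meta

def estimate_task_dimension(theta_meta, tasks, variance_threshold=0.95):
    # Estimate the intrinsic dimension as the number of principal components
    # needed to explain most of the variance of the adaptation directions.
    U = np.stack([adaptation_direction(theta_meta, t) for t in tasks])  # (num_tasks, num_params)
    U -= U.mean(axis=0, keepdims=True)
    # Singular values of the centered update matrix give the PCA spectrum.
    s = np.linalg.svd(U, compute_uv=False)
    explained = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(explained, variance_threshold) + 1)

Under the paper's claim, the estimated dimension for a regression benchmark whose tasks vary along k latent factors would be close to k, well below the number of network parameters.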

Author Information

Gustaf Tegnér (KTH Royal Institute of Technology)
Alfredo Reichlin (KTH Royal Institute of Technology)
Hang Yin (KTH)
Mårten Björkman (KTH Royal Institute of Technology, Stockholm, Sweden)
Danica Kragic (KTH)
