Learning to Learn Kernels with Variational Random Features

Xiantong Zhen · Haoliang Sun · Yingjun Du · Jun Xu · Yilong Yin · Ling Shao · Cees Snoek


Keywords: [ Graphical Models ] [ Kernel Methods ] [ Meta-learning and Automated ML ] [ Transfer, Multitask and Meta-learning ]


We introduce kernels with random Fourier features into the meta-learning framework for few-shot learning. We propose meta variational random features (MetaVRF) to learn adaptive kernels for the base-learner, formulated as a latent variable model that treats the random feature basis as the latent variable. We cast the optimization of MetaVRF as a variational inference problem by deriving an evidence lower bound under the meta-learning framework. To incorporate shared knowledge from related tasks, we propose a context inference of the posterior, established with an LSTM architecture. The LSTM-based inference network effectively integrates the context information of previous tasks with task-specific information, generating informative and adaptive features. The learned MetaVRF produces kernels of high representational power at a relatively low spectral sampling rate, and it enables fast adaptation to new tasks. Experimental results on a variety of few-shot regression and classification tasks demonstrate that MetaVRF delivers much better, or at least competitive, performance compared with existing meta-learning alternatives.