

Poster

MOKD: Cross-domain Few-shot Classification via Maximizing Optimized Kernel Dependence

Hongduan Tian · Feng Liu · Tongliang Liu · Bo Du · Yiu-ming Cheung · Bo Han


Abstract:

In cross-domain few-shot classification, the nearest centroid classifier (NCC) learns representations that construct a metric space in which few-shot classification is performed by measuring the similarity between each sample and the prototype of each class. The intuition behind NCC is that each sample is pulled toward the centroid of its own class while being pushed away from the centroids of other classes. In this paper, however, we find that NCC-learned representations of samples from different classes can still be highly similar. To address this problem, we propose a bi-level optimization framework, maximizing optimized kernel dependence (MOKD), which learns a set of class-specific representations that match the cluster structures of the labeled data in the given support set. Specifically, MOKD first optimizes the kernel used in the Hilbert-Schmidt independence criterion (HSIC) to obtain the optimized kernel HSIC (opt-HSIC), which captures dependence more effectively. An optimization problem over opt-HSIC is then solved to simultaneously maximize the dependence between representations and labels and minimize the dependence among all samples. Extensive experiments on the representative Meta-Dataset benchmark demonstrate that MOKD not only achieves better generalization performance on unseen domains in most cases but also learns better data clusters for each class.
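For a concrete picture of the two ingredients described above, the sketch below shows an NCC head that scores queries by cosine similarity to class prototypes, and an HSIC-based objective of the kind MOKD optimizes. This is a minimal PyTorch sketch, not the authors' implementation: the Gaussian kernel choice, the learnable bandwidth `log_sigma`, and the trade-off weight `gamma` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def ncc_logits(z_query, z_support, y_support, n_classes):
    # NCC: each class prototype is the mean support embedding of that class;
    # queries are scored by cosine similarity to every prototype.
    protos = torch.stack(
        [z_support[y_support == c].mean(dim=0) for c in range(n_classes)]
    )
    return F.normalize(z_query, dim=-1) @ F.normalize(protos, dim=-1).T

def gaussian_kernel(z, log_sigma):
    # Gaussian (RBF) kernel matrix with a learnable log-bandwidth (assumed).
    d2 = torch.cdist(z, z).pow(2)
    return torch.exp(-d2 / (2 * torch.exp(log_sigma) ** 2))

def hsic(K, L):
    # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the kernel matrices.
    n = K.shape[0]
    H = torch.eye(n, device=K.device) - torch.ones(n, n, device=K.device) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

def mokd_objective(z, y_onehot, log_sigma, gamma=0.5):
    # Outer-level objective: maximize dependence between representations
    # and labels while penalizing dependence among all samples.
    K = gaussian_kernel(z, log_sigma)           # kernel on representations
    L = y_onehot.float() @ y_onehot.float().T   # linear kernel on labels
    return hsic(K, L) - gamma * hsic(K, K)
```

In the bi-level scheme described in the abstract, the inner level first optimizes the kernel (here, the parameter `log_sigma`) so that the HSIC estimate captures dependence better (opt-HSIC); the outer level then updates the feature extractor to maximize `mokd_objective` on the support set.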
