We propose a transductive Laplacian-regularized inference procedure for few-shot tasks. Given any feature embedding learned from the base classes, we minimize a quadratic binary-assignment objective containing two terms: (1) a unary term assigning each query sample to the nearest class prototype, and (2) a pairwise Laplacian term encouraging nearby query samples to have consistent label assignments. Our transductive inference does not re-train the base model, and can be viewed as a graph clustering of the query set, subject to supervision constraints from the support set. We derive a computationally efficient bound optimizer of a relaxation of our objective, which computes independent (parallel) updates for each query sample, while guaranteeing convergence. Following simple cross-entropy training on the base classes, and without complex meta-learning strategies, we conduct comprehensive experiments over five few-shot learning benchmarks. Our LaplacianShot consistently outperforms state-of-the-art methods by significant margins across different models, settings, and datasets. Furthermore, our transductive inference is very fast, with computational times close to those of inductive inference, and scales to large-scale few-shot tasks.
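The inference described above can be sketched in NumPy under simplifying assumptions: squared Euclidean distance to each prototype as the unary term, a symmetrized binary k-NN affinity graph over the query set for the Laplacian term, and a softmax relaxation of the binary assignments updated in parallel. All names and parameter choices below are illustrative; this is not the authors' released implementation.

```python
import numpy as np

def softmax(a):
    """Row-wise softmax with a max-shift for numerical stability."""
    a = a - a.max(axis=1, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=1, keepdims=True)

def laplacianshot_infer(query, prototypes, lam=1.0, k=3, n_iter=20):
    """Sketch of Laplacian-regularized transductive inference.

    query: (N, D) query features; prototypes: (K, D) class prototypes.
    lam: weight of the pairwise Laplacian term (hypothetical default).
    Returns soft assignments of shape (N, K), one row per query sample.
    """
    N = query.shape[0]
    # Unary term: squared Euclidean distance of each query to each prototype.
    d = ((query[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (N, K)
    # Pairwise term: binary k-NN affinity graph over the query set.
    dist = ((query[:, None, :] - query[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(dist, np.inf)  # exclude self-edges
    W = np.zeros((N, N))
    nn = np.argsort(dist, axis=1)[:, :k]
    for i in range(N):
        W[i, nn[i]] = 1.0
    W = (W + W.T) / 2  # symmetrize the affinity matrix
    # Initialize from the unary term alone, then iterate the parallel
    # bound updates: each row is updated independently given the
    # previous assignments, mirroring the independent updates in the text.
    y = softmax(-d)
    for _ in range(n_iter):
        y = softmax(-d + lam * (W @ y))
    return y
```

A usage sketch: with two well-separated prototypes and queries drawn around each, the returned rows sum to one and concentrate on the nearest class, with the Laplacian term smoothing assignments across neighboring queries.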
Imtiaz Ziko (ETS Montreal)
Jose Dolz (ETS Montreal)
Eric Granger (ETS Montreal)
Eric Granger received the Ph.D. degree in EE from École Polytechnique de Montréal in 2001. He was a Defense Scientist with DRDC, Ottawa, from 1999 to 2001, and in R&D with Mitel Networks from 2001 to 2004. He joined the Dept. of Systems Engineering at École de technologie supérieure, Université du Québec, Montréal, Canada, in 2004, where he is currently a Full Professor and the Director of LIVIA, a research laboratory focused on computer vision and artificial intelligence. His research interests include pattern recognition, machine learning, computer vision, and computational intelligence, with applications in affective computing, biometrics, face recognition, medical image analysis, and video surveillance.