Handling previously unseen tasks, given only a few training examples, continues to be a tough challenge in machine learning. We propose TapNets, neural networks augmented with task-adaptive projection for improved few-shot learning. Employing a meta-learning strategy with episode-based training, a network and a set of per-class reference vectors are learned slowly across widely varying tasks. At the same time, for every episode, features in the embedding space are linearly projected into a new space as a form of quick, task-specific conditioning. The training loss is obtained from a distance metric between the query and the reference vectors in the projection space. Excellent generalization results from this combination of slow learning and fast adaptation. When tested on the Omniglot, miniImageNet and tieredImageNet datasets, we obtain state-of-the-art classification accuracies under various few-shot scenarios.
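The mechanism in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: it assumes the projection space is taken as the null space of the (normalized) differences between per-class reference vectors and class prototypes, so that references and prototypes coincide after projection, and queries are then classified by Euclidean distance to the projected references. Function names (`task_adaptive_projection`, `classify`) and numerical details (normalization, rank tolerance) are assumptions for illustration.

```python
import numpy as np

def task_adaptive_projection(refs, prototypes):
    """Build a per-episode linear projection from reference vectors and
    per-class prototypes (mean support embeddings), both of shape
    (num_classes, dim)."""
    # L2-normalize, then take the per-class alignment errors
    refs_n = refs / np.linalg.norm(refs, axis=1, keepdims=True)
    protos_n = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    errors = refs_n - protos_n                      # (num_classes, dim)
    # Projection M spans the null space of the errors: after projection,
    # normalized references and prototypes coincide -- the quick,
    # task-specific conditioning described in the abstract.
    _, s, vt = np.linalg.svd(errors)
    rank = int(np.sum(s > 1e-10))
    M = vt[rank:].T                                  # (dim, dim - rank)
    return M

def classify(queries, refs, M):
    """Assign each query embedding to the nearest reference vector,
    measured by squared Euclidean distance in the projection space."""
    q = queries @ M                                  # (num_queries, proj_dim)
    r = refs @ M                                     # (num_classes, proj_dim)
    d = ((q[:, None, :] - r[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)
```

In training, the distances above would feed a softmax cross-entropy loss, and the embedding network plus the reference vectors would be updated slowly across episodes while `M` is recomputed per episode.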
Sung Whan Yoon (Korea Advanced Institute of Science and Technology (KAIST))
I have been an assistant professor in the School of Electrical and Computer Engineering at Ulsan National Institute of Science and Technology (UNIST) since Mar. 2020. I received my Ph.D. from the School of Electrical Engineering at Korea Advanced Institute of Science and Technology (KAIST) in Aug. 2017, under the supervision of Prof. Jaekyun Moon. Before joining UNIST, I was a postdoctoral researcher at KAIST from Sep. 2017 to Mar. 2020. My research interests are in the areas of artificial intelligence, distributed systems and information theory. Recently, I have been working on meta-learning algorithms that balance inductive bias and adaptation for efficient few-shot classification (results published at ICML 2019 and a NeurIPS 2018 workshop). As future work, I am interested in designing incremental/continual meta-learning, which captures new concepts quickly and builds combined concepts incrementally (an earlier work has been accepted to ICML 2020). In another direction, I hope to develop an intelligent system built on efficient learning algorithms with the following features: the ability to learn from big data (even noisy and non-annotated data), leveraging distributed resources (storage, computation and communication), and being hardware-friendly.
Jun Seo (Korea Advanced Institute of Science and Technology (KAIST))
Jaekyun Moon (KAIST)
Related Events (a corresponding poster, oral, or spotlight)
2019 Poster: TapNet: Neural Network Augmented with Task-Adaptive Projection for Few-Shot Learning »
Wed Jun 12th 01:30 -- 04:00 AM Room Pacific Ballroom
More from the Same Authors
2020 Poster: XtarNet: Learning to Extract Task-Adaptive Representation for Incremental Few-Shot Learning »
Sung Whan Yoon · Do-Yeon Kim · Jun Seo · Jaekyun Moon