SketchEmbedNet: Learning Novel Concepts by Imitating Drawings

Alexander Wang · Mengye Ren · Richard Zemel

Livestream session: Representation Learning 3
Thu 22 Jul 7:25 a.m. — 7:30 a.m. PDT

Sketch drawings capture the salient information of visual concepts. Previous work has shown that neural networks are capable of producing sketches of natural objects drawn from a small number of classes. While earlier approaches focus on generation quality or retrieval, we explore properties of image representations learned by training a model to produce sketches of images. We show that this generative, class-agnostic model produces informative embeddings of images from novel examples, classes, and even novel datasets in a few-shot setting. Additionally, we find that these learned representations exhibit interesting structure and compositionality.
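The few-shot use of the learned embeddings described above can be sketched as follows. This is a minimal illustration, not the paper's method: `embed`, the fixed random projection `W`, and the 28x28 input size are hypothetical stand-ins for the trained SketchEmbedNet encoder, and nearest-centroid (prototype) classification is one common way to evaluate embeddings in a few-shot setting.

```python
import numpy as np

# Hypothetical stand-in for the SketchEmbedNet encoder: the paper trains an
# image encoder jointly with a sketch decoder; here a fixed random projection
# merely illustrates the embedding interface, not the real model.
rng = np.random.default_rng(0)
W = rng.normal(size=(784, 64))  # assumed 28x28 inputs -> 64-d embeddings

def embed(images):
    """Map flattened images to L2-normalized embedding vectors."""
    z = images.reshape(len(images), -1) @ W
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def few_shot_classify(support, support_labels, query):
    """Nearest-centroid classification in embedding space: average the
    support embeddings per class, then assign each query to the closest
    class centroid."""
    z_s, z_q = embed(support), embed(query)
    classes = np.unique(support_labels)
    protos = np.stack([z_s[support_labels == c].mean(axis=0) for c in classes])
    dists = ((z_q[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]
```

Because the classifier is built only from embeddings of a handful of labeled support examples, novel classes never seen during generative training can be handled without any weight updates.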
