Poster

SCHA-VAE: Hierarchical Context Aggregation for Few-Shot Generation

Giorgio Giannone · Ole Winther

Hall E #404

Keywords: [ MISC: Unsupervised and Semi-supervised Learning ] [ MISC: Transfer, Multitask and Meta-learning ] [ PM: Variational Inference ] [ DL: Attention Mechanisms ] [ DL: Generative Models and Autoencoders ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
 
Spotlight presentation: DL: Robustness
Wed 20 Jul 10:15 a.m. PDT — 11:45 a.m. PDT

Abstract:

A few-shot generative model should be able to generate data from a novel distribution after observing only a limited set of examples. In few-shot learning, the model is trained on many sets drawn from distributions that share underlying properties, such as sets of characters from different alphabets or objects from different categories. We extend current latent variable models for sets to a fully hierarchical approach with attention-based point-to-set-level aggregation, and call our method SCHA-VAE for Set-Context-Hierarchical-Aggregation Variational Autoencoder. We explore likelihood-based model comparison, iterative data sampling, and adaptation-free out-of-distribution generalization. Our results show that the hierarchical formulation better captures the intrinsic variability within the sets in the small-data regime. This work generalizes deep latent variable approaches to few-shot learning, taking a step toward large-scale few-shot generation with a formulation that readily works with current state-of-the-art deep generative models.
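
To illustrate the attention-based point-to-set aggregation described in the abstract, the following is a minimal sketch assuming a PyTorch-style implementation. The class name SetContextAggregator, the single-level setup, and all layer sizes are illustrative assumptions, not the authors' code; SCHA-VAE stacks such aggregations across a hierarchy of latent variables.

    # Minimal sketch (not the authors' implementation): per-set features are
    # pooled with attention into a context that parameterizes a set-level latent.
    import torch
    import torch.nn as nn

    class SetContextAggregator(nn.Module):
        def __init__(self, feat_dim: int = 64, num_heads: int = 4, z_dim: int = 32):
            super().__init__()
            # Learnable query that pools the set elements via attention.
            self.query = nn.Parameter(torch.randn(1, 1, feat_dim))
            self.attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
            # Map the pooled context to the parameters of a Gaussian over c.
            self.to_mu = nn.Linear(feat_dim, z_dim)
            self.to_logvar = nn.Linear(feat_dim, z_dim)

        def forward(self, set_feats: torch.Tensor):
            # set_feats: (num_sets, set_size, feat_dim), already encoded per point.
            b = set_feats.size(0)
            q = self.query.expand(b, -1, -1)                  # one query per set
            context, _ = self.attn(q, set_feats, set_feats)   # (num_sets, 1, feat_dim)
            context = context.squeeze(1)
            mu, logvar = self.to_mu(context), self.to_logvar(context)
            # Reparameterized sample of the set-level latent c.
            c = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
            return c, mu, logvar

    if __name__ == "__main__":
        # Toy usage: 8 few-shot sets of 5 examples, each encoded to 64-d features.
        feats = torch.randn(8, 5, 64)
        c, mu, logvar = SetContextAggregator()(feats)
        print(c.shape)  # torch.Size([8, 32])

In the full hierarchical model, a context of this kind conditions the per-sample latents at each level of the hierarchy; the sketch above shows only a single aggregation step.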
