Hierarchical Few-Shot Generative Models
Giorgio Giannone · Ole Winther

A few-shot generative model should be able to generate data from a distribution after observing only a limited set of examples. In few-shot learning, the model is trained on data from many sets drawn from different distributions that share underlying properties, such as sets of characters from different alphabets or sets of images of different object types. We extend current latent variable models for sets to a fully hierarchical approach with an attention-based point-to-set-level aggregation and call our approach \emph{SCHA-VAE}, for Set-Context-Hierarchical-Aggregation Variational Autoencoder. We explore iterative data sampling, likelihood-based model comparison, and adaptation-free out-of-distribution generalization. Our results show that the hierarchical formulation better captures the intrinsic variability within sets in the small-data regime. With this work we generalize deep latent variable approaches to few-shot learning, taking a step towards large-scale few-shot generation with a formulation that can readily work with current state-of-the-art deep generative models.
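The abstract's attention-based point-to-set aggregation can be illustrated with a minimal sketch: each set element's feature vector is scored against a query, scores are normalized with a softmax, and the weighted sum yields a single set-level context vector. This is a pure-Python illustration under assumed names and shapes (`attention_aggregate`, 2-D features, a fixed query), not the paper's implementation, which uses learned queries inside a hierarchical VAE.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_aggregate(features, query):
    # features: list of per-element feature vectors (one per set member)
    # query: query vector (learned in practice; fixed here for illustration)
    scores = [sum(q * f for q, f in zip(query, feat)) for feat in features]
    weights = softmax(scores)
    dim = len(features[0])
    # Weighted sum of element features -> one set-level context vector.
    context = [sum(w * feat[d] for w, feat in zip(weights, features))
               for d in range(dim)]
    return context, weights

# Example: aggregate a 3-element set of 2-D features into one context vector.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx, w = attention_aggregate(feats, query=[1.0, 1.0])
```

Because the aggregation is a permutation-invariant weighted sum, the context vector does not depend on the order of the set elements, which is the key requirement for conditioning a generative model on a set.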

Author Information

Giorgio Giannone (Technical University of Denmark (DTU))
Ole Winther (DTU and KU)
