

Poster

Amortised Learning by Wake-Sleep

Li Kevin Wenliang · Theodore Moskovitz · Heishiro Kanagawa · Maneesh Sahani

Virtual

Keywords: [ Unsupervised and Semi-supervised Learning ] [ Unsupervised Learning ] [ Graphical Models ] [ Generative Models ]


Abstract:

Models that employ latent variables to capture structure in observed data lie at the heart of many current unsupervised learning algorithms, but exact maximum-likelihood learning for powerful and flexible latent-variable models is almost always intractable. Thus, state-of-the-art approaches either abandon the maximum-likelihood framework entirely, or else rely on a variety of variational approximations to the posterior distribution over the latents. Here, we propose an alternative approach that we call amortised learning. Rather than computing an approximation to the posterior over latents, we use a wake-sleep Monte-Carlo strategy to learn a function that directly estimates the maximum-likelihood parameter updates. Amortised learning is possible whenever samples of latents and observations can be simulated from the generative model, treating the model as a "black box". We demonstrate its effectiveness on a wide range of complex models, including those with latents that are discrete or supported on non-Euclidean spaces.
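To make the gradient-regression idea concrete, below is a minimal sketch for an assumed toy linear-Gaussian model x = θz + ε (the model, the polynomial features, and all hyperparameters are illustrative choices, not the paper's setup; the paper uses flexible function approximators). The sketch relies on Fisher's identity, ∇_θ log p_θ(x) = E_{p_θ(z|x)}[∇_θ log p_θ(x, z)]: regressing the complete-data gradient onto x with squared loss on "sleep" samples approximates this conditional expectation, which can then be evaluated on real data in the "wake" phase.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy generative model (illustrative, not from the paper):
# z ~ N(0, 1), x = theta * z + sigma * eps, eps ~ N(0, 1)
sigma = 1.0
theta_true = 2.0
x_data = theta_true * rng.standard_normal(5000) + sigma * rng.standard_normal(5000)

theta = 0.5   # initial parameter guess
lr = 0.1      # step size for the wake-phase update


def features(x):
    # Simple polynomial features; for this toy model the conditional
    # expectation of the complete-data gradient is quadratic in x.
    return np.stack([np.ones_like(x), x, x**2], axis=1)


for step in range(200):
    # --- sleep phase: simulate (z, x) pairs from the current model ---
    z = rng.standard_normal(2000)
    x = theta * z + sigma * rng.standard_normal(2000)

    # Per-sample gradient of the complete-data log-likelihood wrt theta:
    # d/dtheta log p(x, z) = (x - theta * z) * z / sigma^2
    g = (x - theta * z) * z / sigma**2

    # Least-squares regression of g onto features of x approximates
    # E[grad log p(x, z) | x], which by Fisher's identity equals
    # the marginal likelihood gradient grad log p(x).
    w, *_ = np.linalg.lstsq(features(x), g, rcond=None)

    # --- wake phase: apply the learned gradient estimator to real data ---
    theta += lr * features(x_data).dot(w).mean()

print(f"estimated theta: {theta:.3f} (true: {theta_true:.3f})")
```

Note that the regressor is trained only on simulated data, so the generative model is used purely as a sampler, consistent with the "black box" requirement stated above; no posterior inference over z is ever performed.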
