Amortized inference allows latent-variable models trained via variational learning to scale to large datasets. The quality of approximate inference is determined by two factors: a) the capacity of the variational distribution to match the true posterior and b) the ability of the recognition network to produce good variational parameters for each datapoint. We examine approximate inference in variational autoencoders in terms of these factors. We find that divergence from the true posterior is often due to imperfect recognition networks, rather than the limited complexity of the approximating distribution. We show that this is due partly to the generator learning to accommodate the choice of approximation. Furthermore, we show that the parameters used to increase the expressiveness of the approximation play a role in generalizing inference rather than simply improving the complexity of the approximation.
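To make the two factors concrete, the sketch below is a minimal, hypothetical PyTorch illustration (not the authors' code; names such as gaussian_elbo, encoder, and decoder are made up for this example, and the networks are untrained toys). It compares the ELBO obtained with amortized, encoder-produced variational parameters against the ELBO after optimizing per-datapoint Gaussian parameters directly; the difference estimates the gap attributable to the recognition network, while any remaining gap to log p(x) comes from the limited capacity of the Gaussian family.

```python
# Minimal sketch: amortized ELBO vs. per-datapoint optimized ELBO for a Gaussian q(z|x).
# All model and variable names here are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn

def gaussian_elbo(x, mu, logvar, decoder, n_samples=10):
    """Monte Carlo ELBO for a factorized Gaussian q(z|x) = N(mu, diag(exp(logvar)))."""
    std = torch.exp(0.5 * logvar)
    eps = torch.randn(n_samples, *mu.shape)
    z = mu + std * eps                                    # reparameterized samples
    logits = decoder(z)                                   # Bernoulli likelihood logits
    log_px_z = -nn.functional.binary_cross_entropy_with_logits(
        logits, x.expand_as(logits), reduction="none").sum(-1)
    kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(-1)
    return (log_px_z.mean(0) - kl).mean()

# Hypothetical tiny model and stand-in data, for illustration only.
x_dim, z_dim = 784, 20
encoder = nn.Sequential(nn.Linear(x_dim, 200), nn.Tanh(), nn.Linear(200, 2 * z_dim))
decoder = nn.Sequential(nn.Linear(z_dim, 200), nn.Tanh(), nn.Linear(200, x_dim))
x = torch.rand(32, x_dim).bernoulli()

# 1) Amortized inference: variational parameters come from the recognition network.
mu_amort, logvar_amort = encoder(x).chunk(2, dim=-1)
elbo_amortized = gaussian_elbo(x, mu_amort, logvar_amort, decoder)

# 2) Per-datapoint optimization: start from the encoder's output and refine it
#    while keeping the generator fixed.
for p in decoder.parameters():
    p.requires_grad_(False)
mu_opt = mu_amort.detach().clone().requires_grad_(True)
logvar_opt = logvar_amort.detach().clone().requires_grad_(True)
opt = torch.optim.Adam([mu_opt, logvar_opt], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = -gaussian_elbo(x, mu_opt, logvar_opt, decoder)
    loss.backward()
    opt.step()
elbo_optimized = gaussian_elbo(x, mu_opt, logvar_opt, decoder)

# The ELBO difference estimates the gap due to the recognition network; the remaining
# gap to log p(x) reflects the limited capacity of the Gaussian family itself.
print("amortization gap estimate:", (elbo_optimized - elbo_amortized).item())
```

With a trained model and real data, the same comparison (optionally combined with a tighter likelihood bound such as importance-weighted sampling) separates how much of the total inference gap is due to amortization versus the choice of approximating family.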
Author Information
Chris Cremer (University of Toronto)
Xuechen Li (University of Toronto)
David Duvenaud (University of Toronto)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: Inference Suboptimality in Variational Autoencoders
  Wed Jul 11th 04:15 -- 07:00 PM Room Hall B
More from the Same Authors
- 2020 Workshop: INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models
  Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Chris Cremer · Tian Qi Chen · Danilo J. Rezende
- 2020 Poster: Learning the Stein Discrepancy for Training and Evaluating Energy-Based Models without Sampling
  Will Grathwohl · Kuan-Chieh Wang · Joern-Henrik Jacobsen · David Duvenaud · Richard Zemel
- 2019 Workshop: Invertible Neural Networks and Normalizing Flows
  Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Aidan Gomez · Chris Cremer · Aaron Courville · Tian Qi Chen · Danilo J. Rezende
- 2019 Poster: Invertible Residual Networks
  Jens Behrmann · Will Grathwohl · Tian Qi Chen · David Duvenaud · Joern-Henrik Jacobsen
- 2019 Oral: Invertible Residual Networks
  Jens Behrmann · Will Grathwohl · Tian Qi Chen · David Duvenaud · Joern-Henrik Jacobsen
- 2018 Poster: Noisy Natural Gradient as Variational Inference
  Guodong Zhang · Shengyang Sun · David Duvenaud · Roger Grosse
- 2018 Oral: Noisy Natural Gradient as Variational Inference
  Guodong Zhang · Shengyang Sun · David Duvenaud · Roger Grosse