Variational autoencoders (VAEs) have received considerable attention, since they allow us to learn expressive neural density estimators effectively and efficiently. However, learning and inference in VAEs are still problematic due to the sensitive interplay between the generative model and the inference network. Since these problems generally become more severe in high dimensions, we propose a novel hierarchical mixture model over low-dimensional VAE experts. Our model decomposes the overall learning problem into many smaller problems, which are coordinated by the hierarchical mixture, represented as a sum-product network. In experiments we show that our model outperforms classical VAEs on almost all of our experimental benchmarks. Moreover, we show that our model is highly data efficient and degrades very gracefully in extremely low data regimes.
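The following is a minimal PyTorch sketch of the kind of decomposition the abstract describes, not the authors' implementation: a single sum node mixes several product nodes, each of which pairs one small VAE expert per scope (here a fixed two-way split of the input dimensions). The paper's actual model uses a full sum-product network over many scopes; all class names, layer sizes, and the Bernoulli likelihood below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallVAE(nn.Module):
    """A low-dimensional VAE expert; elbo() returns a per-sample ELBO."""
    def __init__(self, x_dim, z_dim=2, h_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.Tanh())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.Tanh(),
                                 nn.Linear(h_dim, x_dim))

    def elbo(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        logits = self.dec(z)
        # Bernoulli reconstruction term, summed over this expert's scope.
        rec = -F.binary_cross_entropy_with_logits(
            logits, x, reduction='none').sum(-1)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return rec - kl  # (batch,) lower bound on log p(x_scope)

class MixtureOfVAEs(nn.Module):
    """One sum node over K product nodes; each product pairs one expert per scope."""
    def __init__(self, x_dim, n_components=4):
        super().__init__()
        self.split = x_dim // 2  # two scopes: x[:, :split] and x[:, split:]
        self.left = nn.ModuleList(
            [SmallVAE(self.split) for _ in range(n_components)])
        self.right = nn.ModuleList(
            [SmallVAE(x_dim - self.split) for _ in range(n_components)])
        self.logits = nn.Parameter(torch.zeros(n_components))  # mixture weights

    def log_prob_bound(self, x):
        xl, xr = x[:, :self.split], x[:, self.split:]
        # Product nodes: independent scopes, so ELBOs add in log-space.
        comps = torch.stack(
            [l.elbo(xl) + r.elbo(xr) for l, r in zip(self.left, self.right)],
            dim=-1)
        # Sum node: weighted mixture via log-sum-exp.
        return torch.logsumexp(comps + F.log_softmax(self.logits, -1), dim=-1)

# Usage: maximize the bound on binarized data.
model = MixtureOfVAEs(x_dim=784)
x = torch.rand(16, 784).round()
loss = -model.log_prob_bound(x).mean()
loss.backward()
```

Because each expert's ELBO lower-bounds the log-density of its scope and log-sum-exp is monotone in each component, the mixed quantity remains a valid lower bound on the overall log-likelihood, so each small expert can be trained on its low-dimensional slice while the mixture weights coordinate them.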
Author Information
Ping Liang Tan (University of Cambridge)
Robert Peharz (University of Cambridge)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Oral: Hierarchical Decompositional Mixtures of Variational Autoencoders
  Wed Jun 12th 11:30 -- 11:35 PM, Room Hall B
More from the Same Authors
- 2020 Poster: Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits
  Robert Peharz · Steven Lang · Antonio Vergari · Karl Stelzner · Alejandro Molina · Martin Trapp · Guy Van den Broeck · Kristian Kersting · Zoubin Ghahramani
- 2019 Poster: Faster Attend-Infer-Repeat with Tractable Probabilistic Models
  Karl Stelzner · Robert Peharz · Kristian Kersting
- 2019 Oral: Faster Attend-Infer-Repeat with Tractable Probabilistic Models
  Karl Stelzner · Robert Peharz · Kristian Kersting