Poster in Workshop: Beyond Bayes: Paths Towards Universal Reasoning Systems
P16: Bayesian Reasoning with Trained Neural Networks
Authors: Jakob Knollmüller, Torsten Ensslin
Abstract: We show how to use trained neural networks to perform Bayesian reasoning in order to solve tasks outside their initial scope. Deep generative models provide prior knowledge, and classification/regression networks impose constraints. The tasks at hand are formulated as Bayesian inference problems, which we approximately solve through variational or sampling techniques. The approach builds on top of already trained networks, and the addressable questions grow super-exponentially with the number of available networks. In its simplest form, the approach yields conditional generative models. Combining multiple simultaneous constraints, however, allows far more elaborate questions to be posed. We compare the approach to specifically trained generators, show how to solve riddles, and demonstrate its compatibility with state-of-the-art architectures.
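As an illustration of the basic idea, the sketch below shows how a pretrained, unconditioned generative model (the prior) and a pretrained classifier (the constraint) might be combined into a conditional generator by inference over the latent space. It is a minimal sketch only: the abstract refers to variational and sampling techniques, whereas this example uses plain gradient-based MAP estimation for brevity, and the module names `decoder` and `classifier` are hypothetical placeholders for already-trained PyTorch networks.

```python
# Minimal sketch: conditional generation from already-trained networks by
# inference over the generator's latent space. The paper's approach uses
# variational or sampling approximations; gradient-based MAP is shown here
# only as the simplest stand-in. `decoder` (latent -> image) and
# `classifier` (image -> class logits) are assumed pretrained modules.

import torch

def conditional_map_sample(decoder, classifier, target_class,
                           latent_dim=64, steps=500, lr=0.05):
    z = torch.randn(1, latent_dim, requires_grad=True)    # initialize from the latent prior
    opt = torch.optim.Adam([z], lr=lr)
    target = torch.tensor([target_class])
    for _ in range(steps):
        opt.zero_grad()
        x = decoder(z)                                     # prior knowledge: generative model
        log_lik = -torch.nn.functional.cross_entropy(      # constraint: classifier as likelihood
            classifier(x), target)
        log_prior = -0.5 * (z ** 2).sum()                  # standard normal latent prior
        loss = -(log_lik + log_prior)                      # negative log posterior (up to a constant)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return decoder(z)                                  # approximate posterior (MAP) image
```

Under this formulation, further constraints (for example, an additional regression network) would simply contribute further log-likelihood terms to the same objective, which is why the set of addressable questions grows combinatorially with the number of available trained networks.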