How can one perform Bayesian inference on stochastic simulators with intractable likelihoods? A recent approach is to learn the posterior from adaptively proposed simulations using neural-network-based conditional density estimators. However, existing methods are limited to a narrow range of proposal distributions or require importance weighting that can limit performance in practice. Here we present automatic posterior transformation (APT), a new approach for simulation-based inference via neural posterior estimation. APT can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators. We show that APT is more flexible, scalable, and efficient than previous simulation-based inference techniques and can directly learn informative features from high-dimensional and time-series data.
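To illustrate the core idea of neural posterior estimation — drawing parameters from a proposal, running the simulator, and fitting a conditional density estimator q(θ|x) that can then be evaluated at observed data — here is a minimal toy sketch. All names and values below are illustrative assumptions, not from the paper: a linear-Gaussian fit stands in for the neural network or normalizing flow, and a conjugate Gaussian simulator is used so the learned posterior can be checked against the analytic one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy setup: prior theta ~ N(0, 1),
# simulator x = theta + N(0, 0.5^2).
def simulator(theta):
    return theta + 0.5 * rng.normal(size=theta.shape)

# One round of simulation-based posterior learning: draw parameters from
# the proposal (here simply the prior), simulate data, and fit a
# conditional density estimator q(theta | x). A linear-Gaussian fit
# replaces the flow-based estimator of the paper to keep the sketch tiny.
theta = rng.normal(size=10_000)
x = simulator(theta)

# Least-squares fit of E[theta | x] = a*x + b, plus the residual std,
# gives a Gaussian conditional estimate q(theta | x).
a, b = np.polyfit(x, theta, 1)
resid_std = np.std(theta - (a * x + b))

# Evaluate the learned posterior at an "observed" data point.
x_o = 1.0
post_mean, post_std = a * x_o + b, resid_std

# Analytic posterior for this conjugate toy problem, for comparison:
# mean = x_o / (1 + 0.25), variance = 0.25 / (1 + 0.25).
true_mean = x_o / 1.25
true_std = np.sqrt(0.25 / 1.25)
```

In this conjugate toy problem the learned posterior closely matches the analytic one; the point of APT is that the same learn-from-simulations scheme remains valid when later rounds propose parameters from distributions other than the prior.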
David Greenberg (Technical University of Munich)
Marcel Nonnenmacher (Technical University of Munich)
Jakob Macke (Technical University of Munich)
Related Events (a corresponding poster, oral, or spotlight)
2019 Poster: Automatic Posterior Transformation for Likelihood-Free Inference
Wed Jun 12th, 01:30 -- 04:00 AM, Room: Pacific Ballroom