Oral
Automatic Posterior Transformation for Likelihood-Free Inference
David Greenberg · Marcel Nonnenmacher · Jakob Macke

Tue Jun 11th 05:10 -- 05:15 PM @ Room 101

How can one perform Bayesian inference on stochastic simulators with intractable likelihoods? A recent approach is to learn the posterior from adaptively proposed simulations using neural-network-based conditional density estimators. However, existing methods are limited to a narrow range of proposal distributions or require importance weighting, which can limit performance in practice. Here we present automatic posterior transformation (APT), a new approach for simulation-based inference via neural posterior estimation. APT can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators. We show that APT is more flexible, scalable and efficient than previous simulation-based inference techniques, and that it can directly learn informative features from high-dimensional and time-series data.
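To make the setting concrete, the following toy sketch illustrates the basic idea behind neural posterior estimation that APT builds on: draw parameters from a proposal (here simply the prior), run the simulator, and fit a conditional density estimate q(theta | x) to the resulting pairs. This is not APT itself; the simulator, the linear-Gaussian estimator (standing in for a neural network or normalizing flow), and all numbers are illustrative assumptions chosen so the answer can be checked against the exact conjugate-Gaussian posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator with a tractable posterior for sanity-checking:
# prior theta ~ N(0, 1), likelihood x | theta ~ N(theta, 0.1^2).
def simulate(theta):
    return theta + 0.1 * rng.normal(size=theta.shape)

# One round of posterior estimation: propose from the prior, simulate,
# then fit q(theta | x). Least squares stands in for the neural density
# estimator; with a prior proposal no reweighting/transformation is needed.
n = 100_000
theta = rng.normal(size=n)   # parameter draws from the prior (= proposal)
x = simulate(theta)          # corresponding simulator outputs

# Fit theta ~ a * x + b; the residual spread estimates the posterior std.
A = np.stack([x, np.ones(n)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, theta, rcond=None)
resid_std = np.std(theta - (a * x + b))

# Evaluate at an "observation" x0 and compare with the exact posterior:
# mean = x0 / (1 + 0.1^2), variance = 0.01 / 1.01.
x0 = 0.5
post_mean_est = a * x0 + b
post_mean_true = x0 / 1.01
print(post_mean_est, post_mean_true, resid_std)
```

APT's contribution, per the abstract, is to make such rounds work with arbitrary, dynamically updated proposals (not just the prior) and with flow-based estimators, without the importance weighting earlier methods required.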

Author Information

David Greenberg (Technical University of Munich)
Marcel Nonnenmacher (Technical University of Munich)
Jakob Macke (Technical University of Munich)
