Automatic Posterior Transformation for Likelihood-Free Inference
David Greenberg · Marcel Nonnenmacher · Jakob Macke

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #238

How can one perform Bayesian inference on stochastic simulators with intractable likelihoods? A recent approach is to learn the posterior from adaptively proposed simulations using neural network-based conditional density estimators. However, existing methods are limited to a narrow range of proposal distributions or require importance weighting that can limit performance in practice. Here we present automatic posterior transformation (APT), a new sequential neural posterior estimation method for simulation-based inference. APT can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators. It is more flexible, scalable and efficient than previous simulation-based inference techniques. APT can operate directly on high-dimensional time series and image data, opening up new applications for likelihood-free inference.
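The sequential scheme the abstract describes (simulate under a proposal, fit a conditional density estimator, condition on the observed data to form the next proposal) can be illustrated with a deliberately simplified sketch. This is not the APT algorithm itself: APT uses flow-based estimators and an automatic transformation that corrects for arbitrary proposals, whereas the toy loop below uses a hypothetical linear-Gaussian estimator and applies no correction, so later rounds over-concentrate around the proposal; this leakage is exactly the problem APT addresses. All names (`simulator`, `snpe_toy`) and parameter choices are illustrative assumptions.

```python
import random
import statistics

def simulator(theta):
    # Toy stochastic simulator: Gaussian observation centered on theta.
    return theta + random.gauss(0.0, 0.5)

def snpe_toy(x_o, rounds=3, n_sims=2000, seed=0):
    """Naive sequential neural-posterior-estimation loop (toy sketch).

    Each round: sample parameters from the current proposal, simulate,
    fit a conditional density estimator theta | x, then condition on the
    observed data x_o to obtain the next round's proposal.
    """
    random.seed(seed)
    mu, sigma = 0.0, 2.0  # proposal starts at the prior N(0, 2^2)
    for _ in range(rounds):
        thetas = [random.gauss(mu, sigma) for _ in range(n_sims)]
        xs = [simulator(t) for t in thetas]
        # Linear-Gaussian conditional estimator theta | x ~ N(a*x + b, s^2),
        # fit by least squares; it stands in for the flow-based density
        # estimators used in practice.
        mx = statistics.fmean(xs)
        mt = statistics.fmean(thetas)
        cov = sum((x - mx) * (t - mt) for x, t in zip(xs, thetas)) / n_sims
        a = cov / statistics.pvariance(xs)
        b = mt - a * mx
        resid = [t - (a * x + b) for t, x in zip(thetas, xs)]
        s = statistics.pstdev(resid)
        # Condition the estimator on x_o to get the next proposal.
        mu, sigma = a * x_o + b, s
    return mu, sigma
```

Running `snpe_toy(1.0)` yields a posterior estimate concentrated near the observation; without APT-style correction each round shrinks the estimate relative to the true posterior, which motivates the importance weighting of earlier methods and the posterior transformation proposed here.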

Author Information

David Greenberg (Technical University of Munich)
Marcel Nonnenmacher (Technical University of Munich)
Jakob Macke (Technical University of Munich)
