Neural Posterior Estimation methods for simulation-based inference can be ill-suited for dealing with posterior distributions obtained by conditioning on multiple observations, as they tend to require a large number of simulator calls to learn accurate approximations. In contrast, Neural Likelihood Estimation methods can handle multiple observations at inference time after learning from individual observations, but they rely on standard inference methods, such as MCMC or variational inference, which come with certain performance drawbacks. We introduce a new method based on conditional score modeling that enjoys the benefits of both approaches. We model the scores of the (diffused) posterior distributions induced by individual observations, and introduce a way of combining the learned scores to approximately sample from the target posterior distribution. Our approach is sample-efficient, can naturally aggregate multiple observations at inference time, and avoids the drawbacks of standard inference methods.
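The score-combination idea in the abstract can be illustrated in a toy Gaussian model where every score is available in closed form. For conditionally independent observations, Bayes' rule gives p(θ | x₁…xₙ) ∝ p(θ)^(1−n) ∏ᵢ p(θ | xᵢ), so the posterior score is the sum of the individual-observation posterior scores minus (n − 1) times the prior score. The sketch below (hypothetical example, not the paper's code) plugs these exact scores into unadjusted Langevin dynamics in place of the learned diffused scores, and checks the result against the analytic posterior:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (hypothetical, for illustration only):
# prior theta ~ N(0, 1), likelihood x | theta ~ N(theta, sigma2)
sigma2 = 0.5
obs = np.array([1.2, 0.8, 1.5])  # multiple observations of the same theta
n = len(obs)

def prior_score(theta):
    # d/dtheta log N(theta; 0, 1)
    return -theta

def single_posterior_score(theta, x):
    # d/dtheta [log N(x; theta, sigma2) + log N(theta; 0, 1)]
    return (x - theta) / sigma2 - theta

def combined_score(theta):
    # Sum of per-observation posterior scores minus (n - 1) prior scores:
    # this equals the score of p(theta | x_1, ..., x_n) exactly here.
    return (sum(single_posterior_score(theta, x) for x in obs)
            - (n - 1) * prior_score(theta))

# Unadjusted Langevin dynamics driven by the combined score
theta, step = 0.0, 1e-3
samples = []
for t in range(200_000):
    theta += step * combined_score(theta) + np.sqrt(2 * step) * rng.normal()
    if t > 50_000:  # discard burn-in
        samples.append(theta)

# Analytic Gaussian posterior: precision = 1 + n / sigma2
post_var = 1.0 / (1.0 + n / sigma2)
post_mean = post_var * obs.sum() / sigma2
print(np.mean(samples), post_mean)  # the two means should be close
```

In the paper's setting the individual posterior scores are learned networks rather than closed-form expressions, and the sampler operates on diffused (noise-perturbed) posteriors, but the aggregation at inference time follows the same additive pattern: each new observation contributes one score term, with no retraining.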
Author Information
Tomas Geffner (University of Massachusetts, Amherst)
George Papamakarios (DeepMind)
Andriy Mnih (DeepMind)
More from the Same Authors
- 2021 Workshop: INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models » Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Ricky T. Q. Chen · Danilo J. Rezende
- 2021 Poster: Generalized Doubly Reparameterized Gradient Estimators » Matthias Bauer · Andriy Mnih
- 2021 Spotlight: Generalized Doubly Reparameterized Gradient Estimators » Matthias Bauer · Andriy Mnih
- 2021 Poster: The Lipschitz Constant of Self-Attention » Hyunjik Kim · George Papamakarios · Andriy Mnih
- 2021 Spotlight: The Lipschitz Constant of Self-Attention » Hyunjik Kim · George Papamakarios · Andriy Mnih
- 2020 Workshop: INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models » Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Chris Cremer · Ricky T. Q. Chen · Danilo J. Rezende
- 2020 Poster: On Contrastive Learning for Likelihood-free Inference » Conor Durkan · Iain Murray · George Papamakarios
- 2020 Poster: Normalizing Flows on Tori and Spheres » Danilo J. Rezende · George Papamakarios · Sebastien Racaniere · Michael Albergo · Gurtej Kanwar · Phiala Shanahan · Kyle Cranmer
- 2019 Workshop: Invertible Neural Networks and Normalizing Flows » Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Aidan Gomez · Chris Cremer · Aaron Courville · Ricky T. Q. Chen · Danilo J. Rezende
- 2018 Poster: Disentangling by Factorising » Hyunjik Kim · Andriy Mnih
- 2018 Oral: Disentangling by Factorising » Hyunjik Kim · Andriy Mnih