Likelihood-free methods perform parameter inference in stochastic simulator models where evaluating the likelihood is intractable but sampling synthetic data is possible. One class of methods for this likelihood-free problem uses a classifier to distinguish between pairs of parameter-observation samples generated using the simulator and pairs sampled from some reference distribution, thereby implicitly learning a density ratio proportional to the likelihood. Another popular class of methods fits a conditional distribution to the parameter posterior directly, and a recent variant of this approach allows flexible neural density estimators to be used for the task. In this work, we show that both of these approaches can be unified under a general contrastive learning scheme, and we clarify how they should be run and compared.
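As a concrete illustration of the classifier-based approach described in the abstract, here is a minimal sketch of ratio estimation by classification on a toy Gaussian simulator. This is not the paper's implementation: the simulator, network architecture, and training loop are illustrative assumptions. A binary classifier is trained to distinguish jointly sampled (theta, x) pairs from pairs drawn from the product of marginals as the reference distribution; its logit then approximates a log density ratio proportional to the likelihood.

```python
import torch
import torch.nn as nn

# Toy simulator (an assumption for illustration): x ~ N(theta, 1).
def simulate(theta):
    return theta + torch.randn_like(theta)

# Draw parameters from a prior and corresponding observations from the simulator.
prior = torch.distributions.Normal(0.0, 2.0)
theta = prior.sample((1000, 1))
x = simulate(theta)

# Binary classifier on (theta, x) pairs. Its logit approximates
# log p(theta, x) - log p(theta) p(x), a log density ratio that is
# proportional to the likelihood up to terms independent of theta.
clf = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for _ in range(500):
    # Positive pairs: (theta, x) sampled jointly via the simulator.
    pos = torch.cat([theta, x], dim=1)
    # Negative pairs: theta shuffled against x, i.e. the product of
    # marginals used as the reference distribution.
    neg = torch.cat([theta[torch.randperm(len(theta))], x], dim=1)
    inputs = torch.cat([pos, neg])
    labels = torch.cat([torch.ones(len(pos), 1), torch.zeros(len(neg), 1)])
    loss = bce(clf(inputs), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

# For a fixed observation x_obs, evaluating
# clf(torch.cat([theta_grid, x_obs.expand(len(theta_grid), 1)], dim=1))
# over a grid of parameter values gives unnormalized log-posterior values.
```

After training, combining the learned log ratio with the log prior yields an unnormalized log posterior, which can then be sampled with standard MCMC; the posterior-fitting class of methods mentioned in the abstract instead trains a conditional density estimator on the same (theta, x) pairs.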
Author Information
Conor Durkan (University of Edinburgh)
Iain Murray (University of Edinburgh)
Iain Murray is a SICSA Lecturer in Machine Learning at the University of Edinburgh. Iain was introduced to machine learning by David MacKay and Zoubin Ghahramani, both previous NIPS tutorial speakers. He obtained his PhD in 2007 from the Gatsby Computational Neuroscience Unit at UCL. His thesis on Monte Carlo methods received an honourable mention for the ISBA Savage Award. He was a Commonwealth Fellow in Machine Learning at the University of Toronto before moving to Edinburgh in 2010. Iain's research interests include building flexible probabilistic models of data and probabilistic inference from indirect and uncertain observations. Iain is passionate about teaching. He has lectured at several summer schools, is listed in the top 15 authors on videolectures.net, and was awarded the EUSA Van Heyningen Award for Teaching in Science and Engineering in 2015.
George Papamakarios (DeepMind)
More from the Same Authors
- 2023 Poster: Compositional Score Modeling for Simulation-Based Inference »
  Tomas Geffner · George Papamakarios · Andriy Mnih
- 2021 Workshop: INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models »
  Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Ricky T. Q. Chen · Danilo J. Rezende
- 2021 Poster: The Lipschitz Constant of Self-Attention »
  Hyunjik Kim · George Papamakarios · Andriy Mnih
- 2021 Spotlight: The Lipschitz Constant of Self-Attention »
  Hyunjik Kim · George Papamakarios · Andriy Mnih
- 2020 Workshop: INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models »
  Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Chris Cremer · Ricky T. Q. Chen · Danilo J. Rezende
- 2020 Poster: Normalizing Flows on Tori and Spheres »
  Danilo J. Rezende · George Papamakarios · Sebastien Racaniere · Michael Albergo · Gurtej Kanwar · Phiala Shanahan · Kyle Cranmer
- 2019 Workshop: Invertible Neural Networks and Normalizing Flows »
  Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Aidan Gomez · Chris Cremer · Aaron Courville · Ricky T. Q. Chen · Danilo J. Rezende
- 2019 Poster: Autoregressive Energy Machines »
  Conor Durkan · Charlie Nash
- 2019 Oral: Autoregressive Energy Machines »
  Conor Durkan · Charlie Nash
- 2019 Poster: BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning »
  Asa Cooper Stickland · Iain Murray
- 2019 Oral: BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning »
  Asa Cooper Stickland · Iain Murray
- 2018 Poster: Dynamic Evaluation of Neural Sequence Models »
  Ben Krause · Emmanuel Kahembwe · Iain Murray · Steve Renals
- 2018 Oral: Dynamic Evaluation of Neural Sequence Models »
  Ben Krause · Emmanuel Kahembwe · Iain Murray · Steve Renals