

Poster in Workshop: The Synergy of Scientific and Machine Learning Modelling (SynS & ML) Workshop

Simulation-based Inference with the Generalized Kullback-Leibler Divergence

Benjamin Kurt Miller · Marco Federici · Christoph Weniger · Patrick Forré

Keywords: [ unnormalized distribution estimation ] [ generalized energy-based model ] [ likelihood-free inference ] [ simulation-based inference ] [ implicit likelihood ] [ hybrid model ]


Abstract:

In Simulation-based Inference, the goal is to solve the inverse problem when the likelihood is only known implicitly. Fitting a normalized density estimator to act as a surrogate model for the posterior is known as Neural Posterior Estimation. In its current form, it cannot fit unnormalized surrogates because it optimizes the Kullback-Leibler divergence. We propose to (1) optimize a generalized Kullback-Leibler divergence that accounts for the normalization constant of unnormalized distributions. (2) The objective recovers Neural Posterior Estimation when the model class is normalized and (3) unifies it with Neural Ratio Estimation, combining both under a single objective. (4) We investigate a hybrid model that offers the best of both worlds by pairing a learned normalized base distribution with a learned ratio, and (5) present benchmark results.
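
As a rough illustration of the objective described in the abstract (the exact formulation is in the paper; what follows uses a common textbook definition), the generalized Kullback-Leibler divergence between a normalized posterior p(theta | x) and an unnormalized surrogate q̃(theta | x) can be written as

    D_{\mathrm{GKL}}(p \,\|\, \tilde q) \;=\; \int \Big[\, p(\theta \mid x)\,\log\frac{p(\theta \mid x)}{\tilde q(\theta \mid x)} \;-\; p(\theta \mid x) \;+\; \tilde q(\theta \mid x) \,\Big]\, d\theta .

When q̃ integrates to one, the last two terms cancel and the ordinary Kullback-Leibler divergence is recovered, which is consistent with the claim that the objective reduces to Neural Posterior Estimation for normalized model classes. Dropping terms that do not depend on the model parameters leaves E_p[-log q̃(theta | x)] + ∫ q̃(theta | x) dtheta as the quantity to minimize.

The sketch below is a hypothetical Monte Carlo estimator of that quantity for a hybrid surrogate q̃(theta | x) = q_base(theta | x) * exp(f_phi(theta, x)), i.e., a normalized base distribution corrected by a learned log-ratio. The function names (log_q_base, f_phi, base_sampler) are placeholders for illustration, not the authors' API.

# Hypothetical sketch (not the authors' code): Monte Carlo estimate of a
# generalized-KL training loss for an unnormalized posterior surrogate
#   q_tilde(theta | x) = q_base(theta | x) * exp(f_phi(theta, x)),
# where q_base is a normalized base distribution and f_phi is a learned log-ratio.
import torch


def generalized_kl_loss(log_q_base, f_phi, base_sampler, theta, x, n_base_samples=64):
    """Estimate E_p[-log q_tilde(theta | x)] + integral of q_tilde(theta | x) dtheta,
    dropping terms that do not depend on the model parameters.

    log_q_base(theta, x): log density of the normalized base distribution.
    f_phi(theta, x):      learned log-ratio (unnormalized correction term).
    base_sampler(x, n):   draws n theta samples per x from the base distribution.
    theta, x:             joint samples (theta_i, x_i) from the prior and simulator.
    """
    # Negative log of the unnormalized surrogate on joint prior-simulator samples.
    nll = -(log_q_base(theta, x) + f_phi(theta, x)).mean()

    # Normalization term: the integral of q_tilde over theta equals
    # E_{q_base}[exp(f_phi)], estimated with samples from the base distribution.
    theta_base = base_sampler(x, n_base_samples)             # (n_base_samples, batch, theta_dim)
    x_rep = x.unsqueeze(0).expand(n_base_samples, *x.shape)  # broadcast x over base samples
    norm = torch.exp(f_phi(theta_base, x_rep)).mean()

    return nll + norm

Because the normalization term equals an expectation under the base distribution, it can be estimated from base samples alone; if f_phi is identically zero, the term is constant and the estimator reduces to the usual maximum-likelihood loss of Neural Posterior Estimation.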
