Poster in Workshop: The Synergy of Scientific and Machine Learning Modelling (SynS & ML)
Simulation-based Inference with the Generalized Kullback-Leibler Divergence
Benjamin Kurt Miller · Marco Federici · Christoph Weniger · Patrick Forré
Keywords: [ hybrid model ] [ implicit likelihood ] [ simulation-based inference ] [ likelihood-free inference ] [ generalized energy-based model ] [ unnormalized distribution estimation ]
In Simulation-based Inference, the goal is to solve the inverse problem when the likelihood is only known implicitly. Fitting a normalized density estimator as a surrogate model for the posterior is known as Neural Posterior Estimation. In its current form, it cannot fit unnormalized surrogates because it optimizes the Kullback-Leibler divergence. We propose to (1) optimize a generalized Kullback-Leibler divergence that accounts for the normalization constant of unnormalized distributions. (2) The objective recovers Neural Posterior Estimation when the model class is normalized and (3) unifies it with Neural Ratio Estimation, combining both methods under a single objective. (4) We investigate a hybrid model that offers the best of both worlds by learning a normalized base distribution together with a learned ratio, and (5) present benchmark results.
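For concreteness, a standard form of the generalized Kullback-Leibler divergence between the true posterior and an unnormalized conditional surrogate is sketched below; the notation (surrogate $\tilde q_\phi$, base $q_\psi$, ratio $r_\eta$) is illustrative and not taken from the paper. The extra mass term accounts for the normalization constant, and the divergence reduces to the ordinary Kullback-Leibler divergence, and hence to the Neural Posterior Estimation objective, whenever $\tilde q_\phi$ integrates to one:
$$
D_{\mathrm{GKL}}\bigl(p(\theta \mid x) \,\|\, \tilde q_\phi(\theta \mid x)\bigr)
= \mathbb{E}_{p(\theta \mid x)}\!\left[\log \frac{p(\theta \mid x)}{\tilde q_\phi(\theta \mid x)}\right]
+ \int \tilde q_\phi(\theta \mid x)\, \mathrm{d}\theta - 1 .
$$
In this notation, the hybrid model mentioned above could be written as $\tilde q_\phi(\theta \mid x) = q_\psi(\theta \mid x)\, r_\eta(\theta, x)$, a normalized base distribution multiplied by a learned ratio, with the objective averaged over simulated pairs $(\theta, x)$ drawn from the joint distribution.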