Tractable Expected Information Gains for Exponential Family Posteriors
Rik Knowles ⋅ Tom Rainforth
Abstract
We investigate which models admit a collapse of the expected information gain (EIG) and its derivative from a doubly intractable to a singly intractable expression. We prove that a sufficient condition is that the posterior distribution belongs to an exponential family (EF) that depends on the experimental design and data only through its natural parameters, and derive corresponding singly intractable and unbiased estimators for the EIG and its (reparameterised) gradient. We further derive necessary conditions on the likelihood to obtain an EF posterior of the required form, showing that this does not necessarily require the prior to be conjugate. This is complemented by a theoretical analysis of certain degenerate behaviours that may arise when optimising the EIG for EF-modelled experiments. Finally, we empirically demonstrate the benefits of our singly intractable estimators, showing substantial performance gains over standard nested estimators.
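To make the "doubly intractable" baseline concrete, the sketch below implements the standard nested Monte Carlo (NMC) EIG estimator on a hypothetical linear-Gaussian model (not a model from this paper): the outer expectation over data requires an inner Monte Carlo estimate of the marginal likelihood, which is the second layer of intractability that the paper's EF-based estimators remove. The model, sample sizes, and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (for illustration only):
#   theta ~ N(0, 1),   y | theta, d ~ N(d * theta, sigma^2)
sigma = 1.0

def log_lik(y, theta, d):
    # Gaussian log-likelihood log p(y | theta, d)
    return (-0.5 * np.log(2 * np.pi * sigma**2)
            - 0.5 * (y - d * theta) ** 2 / sigma**2)

def nmc_eig(d, N=2000, M=2000):
    # Outer samples: theta_n ~ prior, y_n ~ p(y | theta_n, d)
    theta = rng.standard_normal(N)
    y = d * theta + sigma * rng.standard_normal(N)
    # Inner samples estimate the marginal log p(y_n | d); this nested
    # estimate is the second source of intractability in the EIG
    theta_inner = rng.standard_normal((N, M))
    inner = log_lik(y[:, None], theta_inner, d)
    log_marginal = np.logaddexp.reduce(inner, axis=1) - np.log(M)
    # EIG ~ (1/N) sum_n [ log p(y_n | theta_n, d) - log p(y_n | d) ]
    return np.mean(log_lik(y, theta, d) - log_marginal)

d = 2.0
# For this conjugate toy model the EIG has a closed form,
# 0.5 * log(1 + d^2 / sigma^2), useful as a sanity check.
analytic = 0.5 * np.log(1 + d**2 / sigma**2)
estimate = nmc_eig(d)
```

The nested structure is what makes NMC expensive: its error decays only at the composite rate governed by both N and M, which is the inefficiency the abstract's singly intractable estimators are designed to avoid for EF posteriors.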