
A General Recipe for Likelihood-free Bayesian Optimization

Jiaming Song · Lantao Yu · Willie Neiswanger · Stefano Ermon

Hall E #734

Keywords: [ PM: Bayesian Models and Methods ] [ OPT: Global Optimization ] [ OPT: Learning for Optimization ] [ APP: Everything Else ] [ OPT: Zero-order and Black-box Optimization ]


The acquisition function, a critical component in Bayesian optimization (BO), can often be written as the expectation of a utility function under a surrogate model. However, to ensure that acquisition functions are tractable to optimize, restrictions must be placed on the surrogate model and utility function. To extend BO to a broader class of models and utilities, we propose likelihood-free BO (LFBO), an approach based on likelihood-free inference. LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model. We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem, which extends an existing likelihood-free density ratio estimation method related to probability of improvement (PI). By choosing the utility function for expected improvement (EI), LFBO outperforms the aforementioned method, as well as various state-of-the-art black-box optimization methods on several real-world optimization problems. LFBO can also leverage composite structures of the objective function, which further improves its regret by several orders of magnitude.
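To make the reduction concrete, here is a minimal sketch of the weighted-classification view of LFBO with the EI utility, under stated assumptions: a scikit-learn linear classifier stands in for the paper's neural-network classifier, the threshold is set to a quantile of the observed values (the `gamma` parameter and the function name `lfbo_ei_acquisition` are illustrative choices, not the paper's exact interface), and we use a maximization convention where improvement over the threshold `tau` is `y - tau`.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def lfbo_ei_acquisition(X, y, tau=None, gamma=0.33):
    """Sketch of LFBO with the EI utility (maximization convention).

    Points with y > tau are labeled positive, and each positive example
    is weighted by its improvement (y - tau); negatives get weight 1.
    The fitted classifier's predicted probability of the positive class
    then serves directly as the acquisition function, with no separate
    probabilistic surrogate model.
    """
    if tau is None:
        # Illustrative threshold: the (1 - gamma)-quantile of observed values.
        tau = np.quantile(y, 1 - gamma)
    z = (y > tau).astype(int)             # binary "improvement" labels
    w = np.where(z == 1, y - tau, 1.0)    # EI-style utility weights
    w = w / w.mean()                      # normalize for numerical stability
    clf = LogisticRegression().fit(X, z, sample_weight=w)
    # Acquisition: predicted probability of improvement-weighted positive class.
    return lambda x: clf.predict_proba(np.atleast_2d(x))[:, 1]
```

With a uniform weight of 1 on the positives instead of `y - tau`, the same recipe recovers the PI-style density-ratio method the abstract refers to; the utility-dependent weighting is what distinguishes the EI variant.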
