Poster

A General Recipe for Likelihood-free Bayesian Optimization

Jiaming Song · Lantao Yu · Willie Neiswanger · Stefano Ermon

Hall E #734

Keywords: [ APP: Everything Else ] [ OPT: Global Optimization ] [ PM: Bayesian Models and Methods ] [ OPT: Learning for Optimization ] [ OPT: Zero-order and Black-box Optimization ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
 
Oral presentation: Deep Learning/Optimization
Wed 20 Jul 1:30 p.m. PDT — 3 p.m. PDT

Abstract:

The acquisition function, a critical component in Bayesian optimization (BO), can often be written as the expectation of a utility function under a surrogate model. However, to ensure that acquisition functions are tractable to optimize, restrictions must be placed on the surrogate model and utility function. To extend BO to a broader class of models and utilities, we propose likelihood-free BO (LFBO), an approach based on likelihood-free inference. LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model. We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem, which extends an existing likelihood-free density ratio estimation method related to probability of improvement (PI). By choosing the utility function for expected improvement (EI), LFBO outperforms the aforementioned method, as well as various state-of-the-art black-box optimization methods on several real-world optimization problems. LFBO can also leverage composite structures of the objective function, which further improves its regret by several orders of magnitude.
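The reduction described above — computing the acquisition function as the solution of a weighted classification problem — can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: the function name, the maximization convention, the `gamma` quantile threshold, and the use of a linear logistic classifier are all assumptions made for brevity (in practice a more flexible classifier, such as a neural network or gradient-boosted trees, would be used).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def lfbo_ei_acquisition(X, y, gamma=0.25):
    """Fit a classifier whose output acts as an EI-style acquisition.

    Sketch of the weighted-classification reduction: points whose value
    exceeds a threshold tau are labeled positive and weighted by their
    improvement (y - tau); the rest are negatives with unit weight.
    """
    tau = np.quantile(y, 1.0 - gamma)       # improvement threshold (assumed choice)
    z = (y > tau).astype(int)               # binary labels: did x improve on tau?
    w = np.where(z == 1, y - tau, 1.0)      # EI-style weights on positives
    # With uniform weights (w = 1) this recovers the PI-style
    # density-ratio estimator that LFBO generalizes.
    clf = LogisticRegression().fit(X, z, sample_weight=w)
    return lambda X_new: clf.predict_proba(X_new)[:, 1]
```

The classifier's predicted probability is then maximized over candidate inputs to pick the next query point, in place of a GP-based acquisition; no separate probabilistic surrogate or posterior inference is needed.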
