

Poster

Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization

Chengyue Gong · Jian Peng · Qiang Liu

Pacific Ballroom #239

Keywords: [ Optimization - Others ] [ Bayesian Methods ] [ Approximate Inference ]


Abstract:

Batch Bayesian optimization has been shown to be an efficient and successful approach for black-box function optimization, especially when the evaluation of the cost function is highly expensive but can be efficiently parallelized. In this paper, we introduce a novel variational framework for batch query optimization, based on the argument that the query batch should be selected to have both high diversity and good worst-case performance. This motivates us to introduce a variational objective that combines a quantile-based risk measure (for worst-case performance) with entropy regularization (for enforcing diversity). We derive a gradient-based, particle-based algorithm for solving this quantile-based variational objective, which generalizes Stein variational gradient descent (SVGD). We evaluate our method on a number of real-world applications and show that it consistently achieves better or comparable performance than recent state-of-the-art batch Bayesian optimization methods.
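To make the update rule the abstract alludes to concrete, below is a minimal sketch of a weighted SVGD step in NumPy. Everything here is an illustrative assumption rather than the authors' implementation: the function names (`rbf_kernel`, `svgd_step`, `score_fn`), the RBF kernel choice, and the per-particle `weights` argument, which stands in for a weighting derived from the paper's quantile-based risk measure. With uniform weights, the update reduces to standard SVGD, whose kernel repulsion term supplies the batch diversity the abstract mentions.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / h) and its
    # gradient with respect to the first argument x_i.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / h)
    grad_K = -2.0 / h * K[:, :, None] * (X[:, None, :] - X[None, :, :])
    return K, grad_K

def svgd_step(X, score_fn, weights=None, step_size=1e-2, h=1.0):
    """One weighted SVGD update on a batch of query points X of shape (n, d).

    score_fn(X) returns grad log p(x) for each particle. `weights` is a
    hypothetical per-particle weighting (e.g. derived from a quantile-based
    risk measure); uniform weights recover standard SVGD.
    """
    n = X.shape[0]
    if weights is None:
        weights = np.full(n, 1.0 / n)
    K, grad_K = rbf_kernel(X, h)
    scores = score_fn(X)  # shape (n, d)
    # Driving term: kernel-smoothed, weighted scores pull particles toward
    # high-density regions. Repulsive term: kernel gradients push particles
    # apart, enforcing diversity within the query batch.
    phi = K.T @ (weights[:, None] * scores) \
        + np.einsum('i,ijd->jd', weights, grad_K)
    return X + step_size * phi
```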
