Sequential- and Parallel- Constrained Max-value Entropy Search via Information Lower Bound

Shion Takeno · Tomoyuki Tamura · Kazuki Shitara · Masayuki Karasuyama

Hall E #733

Keywords: [ MISC: Online Learning, Active Learning and Bandits ] [ PM: Bayesian Models and Methods ] [ PM: Gaussian Processes ] [ OPT: Zero-order and Black-box Optimization ]

Abstract
Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
Spotlight presentation: Deep Learning/Optimization
Wed 20 Jul 1:30 p.m. PDT — 3 p.m. PDT


Max-value entropy search (MES) is one of the state-of-the-art approaches in Bayesian optimization (BO). In this paper, we propose a novel variant of MES for constrained problems, called Constrained MES via Information lower BOund (CMES-IBO), which is based on a Monte Carlo (MC) estimator of a lower bound of the mutual information (MI). Unlike existing studies, our MI is defined so that uncertainty with respect to feasibility can be incorporated. We derive a lower bound of the MI that is guaranteed to be non-negative, whereas a constrained counterpart of conventional MES can be negative. We further provide a theoretical analysis that ensures the low variability of our estimator, a property that has not been investigated for any existing information-theoretic BO method. Moreover, using the conditional MI, we extend CMES-IBO to the parallel setting while maintaining its desirable properties. We demonstrate the effectiveness of CMES-IBO on several benchmark functions and real-world problems.
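As background for readers unfamiliar with MES, the following is a minimal sketch of the Monte Carlo estimator underlying conventional (unconstrained) MES, the building block that CMES-IBO extends. For each candidate point with Gaussian posterior N(mu, sigma^2) and a set of sampled maxima f*, the acquisition averages the entropy reduction obtained by upper-truncating the posterior at each f*. The function names and toy values are illustrative; the paper's actual CMES-IBO estimator is a different (feasibility-aware) MI lower bound and is not reproduced here.

```python
from math import erf, exp, log, pi, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    """Standard normal density."""
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def mes_acquisition(mu, sigma, fstar_samples):
    """MC estimate of the MES acquisition at one candidate point.

    For each sampled maximum f*, the per-sample term is the entropy of
    N(mu, sigma^2) minus the entropy of the same Gaussian truncated
    above at f*:
        gamma * pdf(gamma) / (2 * cdf(gamma)) - log cdf(gamma),
    with gamma = (f* - mu) / sigma. Each term is non-negative because
    truncation can only reduce entropy.
    """
    total = 0.0
    for fstar in fstar_samples:
        gamma = (fstar - mu) / sigma
        phi_g = norm_cdf(gamma)
        total += gamma * norm_pdf(gamma) / (2.0 * phi_g) - log(phi_g)
    return total / len(fstar_samples)

# Toy usage: two candidate points with the same mean but different
# posterior uncertainty, evaluated against hypothetical sampled maxima.
fstar_samples = [1.0, 1.5]
a_uncertain = mes_acquisition(mu=0.0, sigma=2.0, fstar_samples=fstar_samples)
a_confident = mes_acquisition(mu=0.0, sigma=0.5, fstar_samples=fstar_samples)
```

Note that the more uncertain candidate receives the larger acquisition value, reflecting that evaluating it yields more information about the maximum; CMES-IBO modifies this quantity so that uncertainty about constraint feasibility also enters the information measure.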
