In the standard setting of Bayesian optimization (BO), evaluating the objective function is assumed to be highly expensive. Multi-fidelity Bayesian optimization (MFBO) accelerates BO by incorporating lower-fidelity observations that are available at a lower sampling cost. We propose a novel information-theoretic approach to MFBO, called multi-fidelity max-value entropy search (MF-MES), which enables a more reliable evaluation of the information gain compared with existing information-based methods for MFBO. Further, we propose a parallelization of MF-MES, mainly for the asynchronous setting, because queries in MFBO typically occur asynchronously due to the variety of sampling costs. We show that most of the computations in our acquisition functions can be derived analytically, except for at most two-dimensional numerical integration, which can be performed efficiently by simple approximations. We demonstrate the effectiveness of our approach using benchmark datasets and a real-world application to materials science data.
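To illustrate the max-value entropy search idea underlying MF-MES, the following is a minimal sketch of the single-fidelity MES acquisition function in the closed form popularized by Wang & Jegelka (2017): the expected reduction in entropy of the objective's maximum, averaged over sampled max values. This is an illustrative sketch only, not the paper's multi-fidelity acquisition; `mes_acquisition`, its arguments, and the example values are all hypothetical, and the max-value samples `y_star_samples` are assumed to come from posterior function draws or a Gumbel approximation.

```python
import numpy as np
from scipy.stats import norm

def mes_acquisition(mu, sigma, y_star_samples):
    """Approximate MES acquisition at candidate points.

    mu, sigma       : GP posterior mean / std at n candidates, shape (n,)
    y_star_samples  : K sampled values of the global maximum, shape (K,)
    Returns the Monte Carlo estimate of the information gain about
    max f from observing each candidate, shape (n,).
    """
    mu = np.asarray(mu, dtype=float)[:, None]        # (n, 1)
    sigma = np.asarray(sigma, dtype=float)[:, None]  # (n, 1)
    y_star = np.asarray(y_star_samples, dtype=float)[None, :]  # (1, K)

    gamma = (y_star - mu) / sigma                    # standardized gap (n, K)
    cdf = norm.cdf(gamma)
    pdf = norm.pdf(gamma)
    # Entropy reduction of the truncated Gaussian, per max-value sample
    terms = gamma * pdf / (2.0 * cdf) - np.log(cdf)
    return terms.mean(axis=1)                        # average over K samples
```

Under this form, candidates with higher posterior uncertainty (relative to the sampled maxima) receive a larger acquisition value, which matches the intuition that they carry more information about the location and value of the optimum.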
Author Information
Shion Takeno (Nagoya Institute of Technology)
Hitoshi Fukuoka (Hitachi Metals, Ltd.)
Yuhki Tsukada (Nagoya University)
Toshiyuki Koyama (Nagoya University)
Motoki Shiga (Gifu University)
Ichiro Takeuchi (Nagoya Institute of Technology / RIKEN)
Masayuki Karasuyama (Nagoya Institute of Technology)
More from the Same Authors
- 2023 Poster: Randomized Gaussian Process Upper Confidence Bound with Tighter Bayesian Regret Bounds
  Shion Takeno · Yu Inatsu · Masayuki Karasuyama
- 2023 Poster: Towards Practical Preferential Bayesian Optimization with Skew Gaussian Processes
  Shion Takeno · Masahiro Nomura · Masayuki Karasuyama
- 2022 Poster: Sequential- and Parallel- Constrained Max-value Entropy Search via Information Lower Bound
  Shion Takeno · Tomoyuki Tamura · Kazuki Shitara · Masayuki Karasuyama
- 2022 Spotlight: Sequential- and Parallel- Constrained Max-value Entropy Search via Information Lower Bound
  Shion Takeno · Tomoyuki Tamura · Kazuki Shitara · Masayuki Karasuyama
- 2022 Poster: Bayesian Optimization for Distributionally Robust Chance-constrained Problem
  Yu Inatsu · Shion Takeno · Masayuki Karasuyama · Ichiro Takeuchi
- 2022 Spotlight: Bayesian Optimization for Distributionally Robust Chance-constrained Problem
  Yu Inatsu · Shion Takeno · Masayuki Karasuyama · Ichiro Takeuchi
- 2021 Poster: Active Learning for Distributionally Robust Level-Set Estimation
  Yu Inatsu · Shogo Iwazaki · Ichiro Takeuchi
- 2021 Spotlight: Active Learning for Distributionally Robust Level-Set Estimation
  Yu Inatsu · Shogo Iwazaki · Ichiro Takeuchi
- 2021 Poster: More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method
  Kazuya Sugiyama · Vo Nguyen Le Duy · Ichiro Takeuchi
- 2021 Spotlight: More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method
  Kazuya Sugiyama · Vo Nguyen Le Duy · Ichiro Takeuchi
- 2020 Poster: Multi-objective Bayesian Optimization using Pareto-frontier Entropy
  Shinya Suzuki · Shion Takeno · Tomoyuki Tamura · Kazuki Shitara · Masayuki Karasuyama
- 2019 Poster: Safe Grid Search with Optimal Complexity
  Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi
- 2019 Oral: Safe Grid Search with Optimal Complexity
  Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi
- 2017 Poster: Selective Inference for Sparse High-Order Interaction Models
  Shinya Suzumura · Kazuya Nakagawa · Yuta Umezu · Koji Tsuda · Ichiro Takeuchi
- 2017 Talk: Selective Inference for Sparse High-Order Interaction Models
  Shinya Suzumura · Kazuya Nakagawa · Yuta Umezu · Koji Tsuda · Ichiro Takeuchi