Spotlight
Active Learning for Distributionally Robust Level Set Estimation
Yu Inatsu · Shogo Iwazaki · Ichiro Takeuchi
Many cases exist in which a black-box function $f$ with high evaluation cost depends on two types of variables $\bm x$ and $\bm w$, where $\bm x$ is a controllable \emph{design} variable and $\bm w$ are uncontrollable \emph{environmental} variables that have random variation following a certain distribution $P$. In such cases, an important task is to find the range of design variables $\bm x$ such that the function $f(\bm x, \bm w)$ has the desired properties by incorporating the random variation of the environmental variables $\bm w$. A natural measure of robustness is the probability that $f(\bm x, \bm w)$ exceeds a given threshold $h$, which is known as the \emph{probability threshold robustness} (PTR) measure in the literature on robust optimization. However, this robustness measure cannot be correctly evaluated when the distribution $P$ is unknown. In this study, we address this problem by considering the \textit{distributionally robust PTR} (DRPTR) measure, which considers the worst-case PTR within a given set of candidate distributions. Specifically, we study the problem of efficiently identifying a reliable set $H$, defined as a region in which the DRPTR measure exceeds a certain desired probability $\alpha$; this can be interpreted as a level set estimation (LSE) problem for the DRPTR measure. We propose a theoretically grounded and computationally efficient active learning method for this problem. We show that the proposed method has theoretical guarantees on convergence and accuracy, and we confirm through numerical experiments that it outperforms existing methods.
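To make the quantities in the abstract concrete, the following is a minimal Monte Carlo sketch of the PTR and DRPTR measures for a toy problem. The function `f`, the threshold `h`, the candidate distributions for $\bm w$, and the value of $\alpha$ are all illustrative assumptions, not the authors' setup or their active learning algorithm (which avoids exhaustive evaluation of the expensive black-box function).

```python
import numpy as np

def f(x, w):
    # Toy stand-in for the expensive black-box function f(x, w);
    # chosen only for illustration.
    return np.sin(3 * x) + x - w ** 2

def ptr(x, w_samples, h):
    # PTR: estimated probability that f(x, w) exceeds the threshold h
    # under the distribution that generated w_samples.
    return np.mean(f(x, w_samples) > h)

def drptr(x, candidate_samplers, h, n=10_000, seed=0):
    # DRPTR: worst-case PTR over a finite set of candidate distributions
    # for w, each represented here by a sampling function.
    rng = np.random.default_rng(seed)
    return min(ptr(x, sampler(rng, n), h) for sampler in candidate_samplers)

# Illustrative candidate distributions for the environmental variable w.
candidates = [
    lambda rng, n: rng.normal(0.0, 0.3, n),
    lambda rng, n: rng.normal(0.1, 0.5, n),
]

h, alpha = 0.0, 0.9
xs = np.linspace(0.0, 2.0, 51)
# The reliable set H collects design points whose DRPTR exceeds alpha.
H = [x for x in xs if drptr(x, candidates, h) > alpha]
```

In this brute-force sketch the DRPTR is evaluated at every candidate design point; the paper's contribution is an active learning strategy that identifies $H$ with far fewer evaluations of $f$.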
Author Information
Yu Inatsu (Nagoya Institute of Technology)
Shogo Iwazaki (Nagoya Institute of Technology)
Ichiro Takeuchi (Nagoya Institute of Technology / RIKEN)
Related Events (a corresponding poster, oral, or spotlight)

2021 Poster: Active Learning for Distributionally Robust Level Set Estimation »
Fri Jul 23rd 04:00 – 06:00 AM Room None
More from the Same Authors

2021 Poster: More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method »
Kazuya Sugiyama · Vo Nguyen Le Duy · Ichiro Takeuchi 
2021 Spotlight: More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method »
Kazuya Sugiyama · Vo Nguyen Le Duy · Ichiro Takeuchi 
2020 Poster: Multi-fidelity Bayesian Optimization with Max-value Entropy Search and its Parallelization »
Shion Takeno · Hitoshi Fukuoka · Yuhki Tsukada · Toshiyuki Koyama · Motoki Shiga · Ichiro Takeuchi · Masayuki Karasuyama 
2019 Poster: Safe Grid Search with Optimal Complexity »
Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi 
2019 Oral: Safe Grid Search with Optimal Complexity »
Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi 
2017 Poster: Selective Inference for Sparse High-Order Interaction Models »
Shinya Suzumura · Kazuya Nakagawa · Yuta Umezu · Koji Tsuda · Ichiro Takeuchi 
2017 Talk: Selective Inference for Sparse High-Order Interaction Models »
Shinya Suzumura · Kazuya Nakagawa · Yuta Umezu · Koji Tsuda · Ichiro Takeuchi