Conditional selective inference (SI) has been actively studied as a new statistical inference framework for data-driven hypotheses. The basic idea of conditional SI is to make inference conditional on the selection event, which is characterized by a set of linear and/or quadratic inequalities. Conditional SI has mainly been studied in the context of feature selection, such as stepwise feature selection (SFS). The main limitation of existing conditional SI methods is a loss of statistical power due to the over-conditioning required for computational tractability. In this study, we develop a more powerful and general conditional SI method for SFS based on the homotopy method, which enables us to overcome this limitation. Homotopy-based SI is especially effective for more complicated feature selection algorithms. As an example, we develop a conditional SI method for forward-backward SFS with an AIC-based stopping criterion and show that it is not adversely affected by the increased complexity of the algorithm. We conduct several experiments to demonstrate the effectiveness and efficiency of the proposed method.
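The selection events described in the abstract — sets of linear inequalities on the response vector — can be illustrated with a minimal sketch of the classical polyhedral approach to conditional SI (the over-conditioned baseline that this work improves upon, not the authors' homotopy method). All function names and the toy selection event below are illustrative assumptions, not from the paper's implementation; we assume y ~ N(mu, sigma^2 I) and a selection event of the form {A y <= b}.

```python
import numpy as np
from scipy.stats import norm

def truncation_interval(A, b, eta, y):
    """Interval [lo, hi] of values of eta^T y that keep A y' <= b along the
    line y' = z + c * t, where z is the part of y orthogonal to eta."""
    c = eta / (eta @ eta)        # direction of the test statistic (Sigma = sigma^2 I)
    z = y - c * (eta @ y)        # residual, independent of eta^T y
    Ac, Az = A @ c, A @ z
    lo, hi = -np.inf, np.inf
    for ac_j, az_j, b_j in zip(Ac, Az, b):
        if ac_j > 0:             # constraint caps the statistic from above
            hi = min(hi, (b_j - az_j) / ac_j)
        elif ac_j < 0:           # constraint caps the statistic from below
            lo = max(lo, (b_j - az_j) / ac_j)
    return lo, hi

def truncated_normal_pvalue(A, b, eta, y, sigma=1.0):
    """Two-sided selective p-value for H0: eta^T mu = 0 given {A y <= b}."""
    lo, hi = truncation_interval(A, b, eta, y)
    scale = sigma * np.sqrt(eta @ eta)   # sd of eta^T y under H0
    t = eta @ y
    denom = norm.cdf(hi / scale) - norm.cdf(lo / scale)
    cdf = (norm.cdf(t / scale) - norm.cdf(lo / scale)) / denom
    return 2 * min(cdf, 1 - cdf)

# Toy selection event {y_1 > 0}, i.e. A = [[-1, 0]], b = [0]:
A = np.array([[-1.0, 0.0]])
b = np.array([0.0])
eta = np.array([1.0, 0.0])
y = np.array([1.5, 0.2])
p = truncated_normal_pvalue(A, b, eta, y)  # N(0,1) truncated to (0, inf)
```

The key point the abstract builds on is visible here: conditioning restricts the null distribution of the test statistic to a truncation region, and conditioning on more events than necessary (over-conditioning) shrinks that region and costs power — which is what the homotopy method is designed to avoid.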
Author Information
Kazuya Sugiyama (Nagoya Institute of Technology)
Vo Nguyen Le Duy (Nagoya Institute of Technology / RIKEN)
Ichiro Takeuchi (Nagoya Institute of Technology / RIKEN)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method
  Thu. Jul 22nd, 04:00 -- 06:00 AM
More from the Same Authors
- 2022 Poster: Bayesian Optimization for Distributionally Robust Chance-constrained Problem
  Yu Inatsu · Shion Takeno · Masayuki Karasuyama · Ichiro Takeuchi
- 2022 Spotlight: Bayesian Optimization for Distributionally Robust Chance-constrained Problem
  Yu Inatsu · Shion Takeno · Masayuki Karasuyama · Ichiro Takeuchi
- 2021 Poster: Active Learning for Distributionally Robust Level-Set Estimation
  Yu Inatsu · Shogo Iwazaki · Ichiro Takeuchi
- 2021 Spotlight: Active Learning for Distributionally Robust Level-Set Estimation
  Yu Inatsu · Shogo Iwazaki · Ichiro Takeuchi
- 2020 Poster: Multi-fidelity Bayesian Optimization with Max-value Entropy Search and its Parallelization
  Shion Takeno · Hitoshi Fukuoka · Yuhki Tsukada · Toshiyuki Koyama · Motoki Shiga · Ichiro Takeuchi · Masayuki Karasuyama
- 2019 Poster: Safe Grid Search with Optimal Complexity
  Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi
- 2019 Oral: Safe Grid Search with Optimal Complexity
  Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi
- 2017 Poster: Selective Inference for Sparse High-Order Interaction Models
  Shinya Suzumura · Kazuya Nakagawa · Yuta Umezu · Koji Tsuda · Ichiro Takeuchi
- 2017 Talk: Selective Inference for Sparse High-Order Interaction Models
  Shinya Suzumura · Kazuya Nakagawa · Yuta Umezu · Koji Tsuda · Ichiro Takeuchi