

Spotlight

More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method

Kazuya Sugiyama · Vo Nguyen Le Duy · Ichiro Takeuchi


Abstract:

Conditional selective inference (SI) has been actively studied as a new statistical inference framework for data-driven hypotheses. The basic idea of conditional SI is to make inference conditional on the selection event, which is characterized by a set of linear and/or quadratic inequalities. Conditional SI has mainly been studied in the context of feature selection, such as stepwise feature selection (SFS). The main limitation of existing conditional SI methods is a loss of power due to over-conditioning, which is required for computational tractability. In this study, we develop a more powerful and general conditional SI method for SFS using the homotopy method, which enables us to overcome this limitation. Homotopy-based SI is especially effective for more complicated feature selection algorithms. As an example, we develop a conditional SI method for forward-backward SFS with an AIC-based stopping criterion and show that it is not adversely affected by the increased complexity of the algorithm. We conduct several experiments to demonstrate the effectiveness and efficiency of the proposed method.
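To make the setup concrete, the following is a minimal sketch of the classical polyhedral (over-conditioned) conditional-SI baseline that the paper improves upon, for a single SFS step: the selection event is a set of linear inequalities in y, which yields a truncation interval for the test statistic and hence a truncated-normal selective p-value. This is an illustration of the baseline idea only, not the paper's homotopy method; all variable names and the simulation setup are assumptions for the example.

```python
import math
import numpy as np

def Phi(x):
    """Standard normal CDF (handles +/- inf via math.erf)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Illustrative setup: y ~ N(0, sigma^2 I) under the global null.
rng = np.random.default_rng(0)
n, p_dim, sigma = 100, 5, 1.0
X = rng.standard_normal((n, p_dim))
X /= np.linalg.norm(X, axis=0)        # unit-norm columns
y = sigma * rng.standard_normal(n)

# One SFS step: select the feature most correlated with y.
corr = X.T @ y
j = int(np.argmax(np.abs(corr)))
eta = np.sign(corr[j]) * X[:, j]      # test direction for feature j
stat = eta @ y                        # test statistic eta^T y

# The selection event {eta^T y >= +-X_k^T y for all k} is a set of
# linear inequalities in y. Decompose y = c * stat + z with z
# independent of stat, and intersect the inequalities to obtain the
# truncation interval [lo, hi] for stat (the "polyhedral lemma").
c = eta / (eta @ eta)
z = y - c * stat
lo, hi = -np.inf, np.inf
for k in range(p_dim):
    for sk in (+1.0, -1.0):
        a = sk * X[:, k]              # inequality: a^T y <= eta^T y
        coef = 1.0 - a @ c            # a^T y <= t  becomes  coef * t >= a^T z
        val = a @ z
        if coef > 1e-12:
            lo = max(lo, val / coef)
        elif coef < -1e-12:
            hi = min(hi, val / coef)

# Selective p-value: stat ~ N(0, sigma^2 ||eta||^2) truncated to [lo, hi].
sd = sigma * np.linalg.norm(eta)
p_value = (Phi(hi / sd) - Phi(stat / sd)) / (Phi(hi / sd) - Phi(lo / sd))
print(f"selected feature {j}, selective p-value {p_value:.3f}")
```

The over-conditioning the abstract refers to arises because methods like this also condition on nuisance quantities (e.g., the signs and order of entry of all selected features), which shrinks the truncation interval and costs power; the paper's homotopy approach conditions on less by tracking the selection event along a path in the statistic.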
