Spotlight in Workshop: Subset Selection in Machine Learning: From Theory to Applications

High-Dimensional Variable Selection and Non-Linear Interaction Discovery in Linear Time

Raj Agrawal · Tamara Broderick


Abstract: Many scientific problems require identifying a small set of covariates that are associated with a target response and estimating their effects. Often these effects are non-linear and include interactions, so linear and additive methods can lead to poor estimation and variable selection. The Bayesian framework makes it straightforward to express sparsity, non-linearity, and interactions simultaneously in a hierarchical model. But, as with the few other methods that handle this trifecta, inference is computationally intractable: runtime is at least quadratic in the number of covariates, and often worse. In the present work, we remove this computational bottleneck. We first show that suitable Bayesian models can be represented as Gaussian processes (GPs). We then demonstrate how a kernel trick reduces computation with these GPs to $O(\text{\# covariates})$ time for both variable selection and estimation. The resulting fit corresponds to a sparse orthogonal decomposition of the regression function in a Hilbert space (i.e., a functional ANOVA decomposition), in which interaction effects represent all variation that cannot be explained by lower-order effects. On a variety of synthetic and real datasets, our approach outperforms existing methods used for large, high-dimensional datasets while remaining competitive in runtime (or being orders of magnitude faster).
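As a minimal illustration of why a kernel trick can yield linear-time computation with interaction models (a sketch for intuition, not the paper's actual algorithm): consider the second-order interaction kernel $k(x, x') = \sum_{j<k} (x_j x'_j)(x_k x'_k)$. Enumerating pairs costs time quadratic in the number of covariates, but with $a_j = x_j x'_j$ the identity $\sum_{j<k} a_j a_k = \big((\sum_j a_j)^2 - \sum_j a_j^2\big)/2$ evaluates the same sum in linear time. The function names below are hypothetical.

```python
import numpy as np

def interaction_kernel_linear(x, xp):
    """Evaluate k(x, x') = sum_{j<k} (x_j x'_j)(x_k x'_k) in O(p) time.

    Illustrative sketch (not the paper's algorithm): uses the identity
    sum_{j<k} a_j a_k = ((sum_j a_j)^2 - sum_j a_j^2) / 2 with
    a_j = x_j * x'_j, so all pairwise interactions are summed without
    forming the O(p^2) set of pairs explicitly.
    """
    a = x * xp            # elementwise products a_j, O(p)
    s1 = a.sum()          # sum_j a_j
    s2 = (a ** 2).sum()   # sum_j a_j^2
    return 0.5 * (s1 ** 2 - s2)

def interaction_kernel_naive(x, xp):
    """Same kernel via explicit O(p^2) enumeration of pairs (for checking)."""
    p = len(x)
    return sum(x[j] * xp[j] * x[k] * xp[k]
               for j in range(p) for k in range(j + 1, p))

rng = np.random.default_rng(0)
x, xp = rng.normal(size=500), rng.normal(size=500)
assert np.isclose(interaction_kernel_linear(x, xp),
                  interaction_kernel_naive(x, xp))
```

The same algebraic collapsing extends to higher-order interactions via elementary symmetric polynomials, which is the kind of structure that makes both estimation and variable selection scale linearly in the number of covariates.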