Workshop: Over-parameterization: Pitfalls and Opportunities

Double Descent in Feature Selection: Revisiting LASSO and Basis Pursuit

David Bosch · Ashkan Panahi · Ayca Ozcelikkale

Abstract: We present a novel analysis of feature selection in linear models within the convex frameworks of the least absolute shrinkage and selection operator (LASSO) and basis pursuit (BP). Our analysis pertains to a general overparameterized scenario. When the number of features and the number of data samples grow proportionally, we obtain precise expressions for the asymptotic generalization error of LASSO and BP. Considering a mixture of strong and weak features, we provide insights into the regularization trade-offs underlying double descent in $\ell_1$-norm minimization. We validate these results with numerical experiments.
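The setting described in the abstract can be illustrated with a small simulation: a linear model whose true coefficient vector mixes a few strong features with many weak ones, fitted by LASSO. This is a minimal sketch, not the paper's exact experimental setup; the coordinate-descent solver, the signal strengths, and the regularization value `lam` are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrinks x toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xw||^2 + lam * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n
    r = y - X @ w  # current residual
    for _ in range(n_iter):
        for j in range(p):
            if col_norm[j] == 0:
                continue
            r += X[:, j] * w[j]            # remove feature j from the fit
            rho = X[:, j] @ r / n          # partial correlation with residual
            w[j] = soft_threshold(rho, lam) / col_norm[j]
            r -= X[:, j] * w[j]            # add feature j back
    return w

rng = np.random.default_rng(0)
n, p = 200, 100
X = rng.standard_normal((n, p))
# Mixture of strong and weak features (illustrative magnitudes, not the paper's).
w_true = np.concatenate([2.0 * rng.standard_normal(10),        # strong
                         0.1 * rng.standard_normal(p - 10)])   # weak
y = X @ w_true + 0.5 * rng.standard_normal(n)

w_hat = lasso_cd(X, y, lam=0.2)

# Out-of-sample generalization error on fresh data.
X_test = rng.standard_normal((1000, p))
y_test = X_test @ w_true + 0.5 * rng.standard_normal(1000)
test_mse = np.mean((y_test - X_test @ w_hat) ** 2)
null_mse = np.mean(y_test ** 2)
```

Sweeping `p` past `n` (or varying `lam`) in this kind of simulation is how a double-descent curve of the test error would be traced out numerically; the $\ell_1$ penalty tends to zero out the weak coefficients while retaining the strong ones.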