By leveraging the convexity of the Lasso problem, screening rules help accelerate solvers by discarding irrelevant variables during the optimization process. However, several non-convex regularizers have been proposed in the literature as alternatives to the Lasso penalty, because they provide better theoretical guarantees for identifying the relevant variables. This work is the first to introduce a screening-rule strategy into a non-convex Lasso solver. The approach we propose is based on an iterative majorization-minimization (MM) strategy that includes a screening rule in the inner solver and a condition for propagating screened variables between MM iterations. Beyond improving solver efficiency, we also provide guarantees that the inner solver is able to identify the zero components of its critical point in finite time. Our experimental analysis illustrates the significant computational gain brought by the new screening rule compared to classical coordinate-descent or proximal-gradient-descent methods.
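To make the mechanics described in the abstract concrete, below is a minimal sketch of the MM-plus-inner-screening idea, not the authors' exact algorithm. It assumes a log-sum penalty lam * log(1 + |w_j| / theta) as the non-convex regularizer (one common choice), a weighted-L1 majorizer, a coordinate-descent inner solver, and a gap-safe sphere test for the inner screening; the function name, penalty choice, and tolerances are illustrative, and the paper's condition for propagating screened variables across MM iterations is not reproduced here (the active set is conservatively reset at each MM step).

```python
# Sketch: MM outer loop majorizes a non-convex sparse penalty by a weighted-L1
# surrogate; the inner coordinate-descent solver applies a gap-safe-style
# sphere test to discard variables that are provably zero at the surrogate's
# optimum. Assumes the columns of X are nonzero.
import numpy as np

def mm_weighted_lasso(X, y, lam, theta=1.0, n_mm=10, n_cd=100, tol=1e-8):
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)        # ||x_j||^2 for coordinate updates
    col_nrm = np.sqrt(col_sq)
    for _ in range(n_mm):
        # Majorization: tangent weighted-L1 surrogate at the current w, with
        # per-feature weights alpha_j = d/dw [lam * log(1 + w / theta)](|w_j|).
        alpha = lam / (theta + np.abs(w))
        active = np.arange(d)
        for _ in range(n_cd):
            # Minimization: coordinate descent on the surrogate, restricted
            # to the not-yet-screened variables.
            r = y - X @ w                # residual, updated in place below
            for j in active:
                z = X[:, j] @ r + col_sq[j] * w[j]
                w_new = np.sign(z) * max(abs(z) - alpha[j], 0.0) / col_sq[j]
                r += X[:, j] * (w[j] - w_new)
                w[j] = w_new
            # Duality gap of the weighted-Lasso surrogate.
            corr = X.T @ r
            dual_pt = r / max(1.0, np.max(np.abs(corr) / alpha))
            primal = 0.5 * (r @ r) + alpha @ np.abs(w)
            dual = 0.5 * (y @ y) - 0.5 * np.sum((y - dual_pt) ** 2)
            gap = primal - dual
            if gap < tol:
                break
            # Gap-safe sphere test: j is provably zero at the surrogate's
            # optimum if |x_j^T dual_pt| + ||x_j|| * radius < alpha_j.
            radius = np.sqrt(max(2.0 * gap, 0.0))
            keep = np.abs(X.T @ dual_pt) + col_nrm * radius >= alpha
            w[~keep] = 0.0
            active = np.flatnonzero(keep)
    return w
```

Calling `mm_weighted_lasso(X, y, lam=0.1)` on data with standardized columns returns a sparse coefficient vector; the inner screening shrinks the active set as the duality gap closes, which is where the computational gain reported in the paper comes from.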
Author Information
Alain Rakotomamonjy (Université de Rouen Normandie / Criteo AI Lab)
Gilles Gasso (INSA Rouen)
Joseph Salmon (Université de Montpellier)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Screening rules for Lasso with non-convex Sparse Regularizers »
  Thu. Jun 13th, 01:30 -- 04:00 AM, Room: Pacific Ballroom #190
More from the Same Authors
- 2022: A Bias-Variance Analysis of Weight Averaging for OOD Generalization »
  Alexandre Ramé · Matthieu Kirchmeyer · Thibaud J Rahier · Alain Rakotomamonjy · Patrick Gallinari · Matthieu Cord
- 2023 Poster: Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals »
  Clément Bonet · Benoît Malézieux · Alain Rakotomamonjy · Lucas Drumetz · Thomas Moreau · Matthieu Kowalski · Nicolas Courty
- 2023 Poster: Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances »
  Ruben Ohana · Kimia Nadjahi · Alain Rakotomamonjy · Ralaivola Liva
- 2022 Poster: Bregman Neural Networks »
  Jordan Frecon · Gilles Gasso · Massimiliano Pontil · Saverio Salzo
- 2022 Spotlight: Bregman Neural Networks »
  Jordan Frecon · Gilles Gasso · Massimiliano Pontil · Saverio Salzo
- 2022 Poster: Stochastic smoothing of the top-K calibrated hinge loss for deep imbalanced classification »
  Camille Garcin · Maximilien Servajean · Alexis Joly · Joseph Salmon
- 2022 Poster: Differentially Private Coordinate Descent for Composite Empirical Risk Minimization »
  Paul Mangold · Aurélien Bellet · Joseph Salmon · Marc Tommasi
- 2022 Spotlight: Differentially Private Coordinate Descent for Composite Empirical Risk Minimization »
  Paul Mangold · Aurélien Bellet · Joseph Salmon · Marc Tommasi
- 2022 Spotlight: Stochastic smoothing of the top-K calibrated hinge loss for deep imbalanced classification »
  Camille Garcin · Maximilien Servajean · Alexis Joly · Joseph Salmon
- 2022 Poster: Generalizing to New Physical Systems via Context-Informed Dynamics Model »
  Matthieu Kirchmeyer · Yuan Yin · Jérémie DONA · Nicolas Baskiotis · Alain Rakotomamonjy · Patrick Gallinari
- 2022 Spotlight: Generalizing to New Physical Systems via Context-Informed Dynamics Model »
  Matthieu Kirchmeyer · Yuan Yin · Jérémie DONA · Nicolas Baskiotis · Alain Rakotomamonjy · Patrick Gallinari
- 2021 Poster: Differentially Private Sliced Wasserstein Distance »
  Alain Rakotomamonjy · Ralaivola Liva
- 2021 Oral: Differentially Private Sliced Wasserstein Distance »
  Alain Rakotomamonjy · Ralaivola Liva
- 2020 Poster: Implicit differentiation of Lasso-type models for hyperparameter optimization »
  Quentin Bertrand · Quentin Klopfenstein · Mathieu Blondel · Samuel Vaiter · Alexandre Gramfort · Joseph Salmon
- 2020 Poster: Partial Trace Regression and Low-Rank Kraus Decomposition »
  Hachem Kadri · Stephane Ayache · Riikka Huusari · Alain Rakotomamonjy · Ralaivola Liva
- 2019 Poster: Optimal Mini-Batch and Step Sizes for SAGA »
  Nidham Gazagnadou · Robert Gower · Joseph Salmon
- 2019 Oral: Optimal Mini-Batch and Step Sizes for SAGA »
  Nidham Gazagnadou · Robert Gower · Joseph Salmon
- 2019 Poster: Safe Grid Search with Optimal Complexity »
  Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi
- 2019 Oral: Safe Grid Search with Optimal Complexity »
  Eugene Ndiaye · Tam Le · Olivier Fercoq · Joseph Salmon · Ichiro Takeuchi