Ridge regression is a variant of regularized least squares regression that is particularly suitable in settings where the number of predictor variables greatly exceeds the number of observations. We present a simple, iterative, sketching-based algorithm for ridge regression that guarantees high-quality approximations to the optimal solution vector. Our analysis builds upon two simple structural results that boil down to randomized matrix multiplication, a fundamental and well-understood primitive of randomized linear algebra. An important contribution of our work is the analysis of the behavior of subsampled ridge regression problems when the ridge leverage scores are used: we prove that accurate approximations can be achieved by a sample whose size depends on the degrees of freedom of the ridge-regression problem rather than the dimensions of the design matrix. Our experimental evaluations verify our theoretical results on both real and synthetic data.
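The paper's iterative algorithm is not reproduced here; as a rough illustration of the ingredients the abstract refers to (the d >> n setting, ridge leverage scores, and column subsampling), the NumPy sketch below solves a one-shot subsampled ridge regression problem. The function names, the sample size s, and the exact (non-approximate) computation of the ridge leverage scores are illustrative choices and should not be read as the authors' implementation.

    import numpy as np

    def ridge_leverage_scores(A, lam):
        # Ridge leverage score of column a_i: tau_i = a_i^T (A A^T + lam I)^{-1} a_i.
        # Their sum equals the effective degrees of freedom of the ridge problem.
        n, d = A.shape
        K_inv = np.linalg.inv(A @ A.T + lam * np.eye(n))
        return np.einsum('ij,jk,ki->i', A.T, K_inv, A)

    def sketched_ridge(A, b, lam, s, seed=0):
        # One-shot sketch (not the paper's iterative scheme): sample s columns of A
        # with probabilities proportional to their ridge leverage scores, rescale,
        # and solve the smaller problem via the dual (kernel) closed form.
        rng = np.random.default_rng(seed)
        n, d = A.shape
        tau = ridge_leverage_scores(A, lam)
        p = tau / tau.sum()                        # sampling probabilities
        idx = rng.choice(d, size=s, p=p)           # sampled column indices
        AS = A[:, idx] / np.sqrt(s * p[idx])       # sampled-and-rescaled columns
        # Approximate A A^T by (A S)(A S)^T in the dual solution
        # x* = A^T (A A^T + lam I)^{-1} b.
        return A.T @ np.linalg.solve(AS @ AS.T + lam * np.eye(n), b)

    # Toy usage in the d >> n regime.
    rng = np.random.default_rng(1)
    n, d = 50, 2000
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    x_exact = A.T @ np.linalg.solve(A @ A.T + 1.0 * np.eye(n), b)
    x_approx = sketched_ridge(A, b, lam=1.0, s=400)
    print(np.linalg.norm(x_approx - x_exact) / np.linalg.norm(x_exact))

In this toy setup the sample size s plays the role the abstract attributes to the degrees of freedom: when s is comfortably larger than tau.sum(), the relative error printed at the end is small, whereas the sketch degrades as s shrinks toward it.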
Author Information
Agniva Chowdhury (Purdue University)
Jiasen Yang (Purdue University)
Petros Drineas (Purdue University)
http://www.drineas.org
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: An Iterative, Sketching-based Framework for Ridge Regression
  Fri. Jul 13th 04:15 -- 07:00 PM, Room Hall B #31
More from the Same Authors
- 2022 Poster: On the Convergence of Inexact Predictor-Corrector Methods for Linear Programming
  Gregory Dexter · Agniva Chowdhury · Haim Avron · Petros Drineas
- 2022 Oral: On the Convergence of Inexact Predictor-Corrector Methods for Linear Programming
  Gregory Dexter · Agniva Chowdhury · Haim Avron · Petros Drineas
- 2018 Poster: Goodness-of-fit Testing for Discrete Distributions via Stein Discrepancy
  Jiasen Yang · Qiang Liu · Vinayak A Rao · Jennifer Neville
- 2018 Oral: Goodness-of-fit Testing for Discrete Distributions via Stein Discrepancy
  Jiasen Yang · Qiang Liu · Vinayak A Rao · Jennifer Neville