

One-shot Distributed Ridge Regression in High Dimensions

Yue Sheng · Edgar Dobriban

Keywords: [ Large Scale Learning and Big Data ] [ Parallel and Distributed Learning ] [ Supervised Learning ] [ Spectral Methods ] [ Optimization - Large Scale, Parallel and Distributed ]


To scale up data analysis, distributed and parallel computing approaches are increasingly needed. Here we study a fundamental problem in this area: how to do ridge regression in a distributed computing environment? We study one-shot methods that construct weighted combinations of ridge regression estimators computed on each machine. By analyzing the mean squared error in a high-dimensional model where each predictor has a small effect, we discover several new phenomena: the efficiency depends strongly on the signal strength but does not degrade with many workers; the risk decouples over machines; and, unexpectedly, the optimal weights do not sum to unity. We also propose a new optimally weighted one-shot ridge regression algorithm. Our results are supported by simulations and real data analysis.
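The one-shot scheme described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: each worker fits a local ridge estimator on its shard, and the results are combined with weights. The paper's optimal weights depend on signal strength and need not sum to one; since the abstract does not give their formula, the sketch below uses plain uniform averaging as a placeholder, and the function names (`local_ridge`, `one_shot_ridge`) are my own.

```python
import numpy as np

def local_ridge(X, y, lam):
    """Ridge estimate on one machine: solve (X'X + lam*I) b = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def one_shot_ridge(X, y, lam, k, weights=None):
    """Split the rows across k workers, fit ridge locally, and return
    a weighted combination of the local estimators.

    weights: length-k array of combination weights. The paper shows the
    optimal weights need not sum to unity; uniform averaging is used
    here only as a stand-in when no weights are supplied.
    """
    X_parts = np.array_split(X, k)
    y_parts = np.array_split(y, k)
    local = [local_ridge(Xj, yj, lam) for Xj, yj in zip(X_parts, y_parts)]
    if weights is None:
        weights = np.full(k, 1.0 / k)
    return sum(w * b for w, b in zip(weights, local))

# Toy data in the spirit of the high-dimensional model: many predictors,
# each with a small effect.
rng = np.random.default_rng(0)
n, p, k = 600, 50, 4
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta + rng.standard_normal(n)

b_dist = one_shot_ridge(X, y, lam=1.0, k=k)   # one-shot distributed fit
b_full = local_ridge(X, y, lam=1.0)           # single-machine baseline
```

In practice only the `p`-dimensional local estimates travel over the network, which is what makes the method "one-shot": a single round of communication, regardless of how many rows each worker holds.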
