

Spotlight Poster

Distributed High-Dimensional Quantile Regression: Estimation Efficiency and Support Recovery

Caixing Wang · Ziliang Shen

Hall C 4-9 #2204
[ Project Page ] [ Paper PDF ] [ Slides ] [ Poster ]
Tue 23 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

In this paper, we focus on distributed estimation and support recovery for high-dimensional linear quantile regression. Quantile regression is a popular alternative to least squares regression because of its robustness to outliers and data heterogeneity. However, the non-smoothness of the check loss function poses significant challenges to both computation and theory in the distributed setting. To tackle these problems, we transform the original quantile regression problem into a least-squares optimization. By applying a double-smoothing approach, we extend a previous Newton-type distributed approach without the restrictive assumption that the error term is independent of the covariates. We develop an algorithm that is both computationally and communication efficient. Theoretically, the proposed distributed estimator achieves a near-oracle convergence rate and high support recovery accuracy after a constant number of iterations. Extensive experiments on synthetic examples and a real data application further demonstrate the effectiveness of the proposed method.
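For context, a minimal sketch of the standard (non-distributed, non-smoothed) formulation referenced above, not taken from the paper itself: the check loss at quantile level $\tau$ and the usual $\ell_1$-penalized quantile regression estimator for high-dimensional support recovery. The penalty weight $\lambda$ is a generic regularization parameter introduced here for illustration; the paper's double-smoothed, distributed objective differs.

% Check (pinball) loss at quantile level \tau \in (0,1)
\[
  \rho_\tau(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr).
\]
% Generic \ell_1-penalized high-dimensional quantile regression estimator
% (illustrative only; \lambda is a hypothetical tuning parameter)
\[
  \hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p}
  \ \frac{1}{n} \sum_{i=1}^{n} \rho_\tau\!\bigl(y_i - x_i^\top \beta\bigr)
  + \lambda \|\beta\|_1 .
\]

The non-differentiability of $\rho_\tau$ at zero is what motivates smoothing before applying Newton-type distributed updates.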
