Robust Inference for High-Dimensional Linear Models via Residual Randomization

Y. Samuel Wang · Si Kai Lee · Panos Toulis · Mladen Kolar

Keywords: Statistical Learning Theory

Wed 21 Jul 9 a.m. PDT — 11 a.m. PDT
Spotlight presentation: Learning Theory 2
Wed 21 Jul 5 a.m. PDT — 6 a.m. PDT


We propose a residual randomization procedure designed for robust inference using Lasso estimates in the high-dimensional setting. Compared to earlier work that focuses on sub-Gaussian errors, the proposed procedure is designed to work robustly in settings that also include heavy-tailed covariates and errors. Moreover, our procedure can be valid under clustered errors, which is important in practice but has been largely overlooked by earlier work. Through extensive simulations, we illustrate our method's wider range of applicability, as suggested by theory. In particular, we show that our method outperforms state-of-the-art methods in challenging, yet more realistic, settings where the distribution of covariates is heavy-tailed or the sample size is small, while remaining competitive in the standard, "well-behaved" settings previously studied in the literature.
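To make the idea concrete, the following is a minimal, hypothetical sketch of a residual randomization test in the spirit described above: fit a Lasso under the null hypothesis, then build a reference distribution for the test statistic by randomly sign-flipping the residuals. The function name, tuning parameters, and choice of test statistic are illustrative assumptions; the paper's actual procedure differs in its details (e.g., how clustered errors are handled).

```python
import numpy as np
from sklearn.linear_model import Lasso

def residual_randomization_pvalue(X, y, j, beta0=0.0, num_draws=200,
                                  alpha_lasso=0.1, seed=0):
    """Illustrative sign-flip residual randomization test for H0: beta_j = beta0.

    This is a sketch, not the authors' exact algorithm: it fits a Lasso on the
    covariates excluding X_j after removing the hypothesized contribution of
    X_j, then compares the observed statistic to a randomization distribution
    generated by flipping residual signs (valid under symmetric errors).
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    # Impose the null: subtract the hypothesized effect of the j-th covariate.
    y0 = y - beta0 * X[:, j]
    X_rest = np.delete(X, j, axis=1)
    # Null-model fit and residuals.
    fit = Lasso(alpha=alpha_lasso).fit(X_rest, y0)
    resid = y0 - fit.predict(X_rest)
    # Test statistic: absolute sample covariance between X_j and the residuals.
    t_obs = abs(X[:, j] @ resid) / n
    # Randomization distribution: independent sign flips of the residuals.
    t_null = np.empty(num_draws)
    for b in range(num_draws):
        signs = rng.choice([-1.0, 1.0], size=n)
        t_null[b] = abs(X[:, j] @ (signs * resid)) / n
    # Standard finite-sample randomization p-value (add-one correction).
    return (1 + np.sum(t_null >= t_obs)) / (1 + num_draws)
```

As a usage example, on synthetic data with a strong first coefficient, testing `j=0` against `beta0=0` should yield a small p-value, while a noise coordinate should typically not.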
