Sparse Regression with $\ell_0$ Constraints for $\alpha$-Mixing Time Series: Algorithms and Guarantees
Ruoxin Yuan ⋅ Lijun Ding
Abstract
Exact sparse methods based on $\ell_0$ constraints are increasingly used for interpretable and scalable time series modeling, where one aims to recover a small set of informative lags/factors while maintaining strong predictive performance and low computational cost. Despite their empirical success, finite-sample and computational guarantees for such methods under temporal dependence remain limited. In this paper, we study $\ell_0$-constrained least squares for time series generated by $\alpha$-mixing stationary Gaussian processes with sparse coefficients. We establish high-probability restricted strong convexity and smoothness (RSC/RSS) for the empirical quadratic loss. Leveraging these conditions, we derive nonasymptotic statistical guarantees and computational complexities for a family of exact sparse methods, including iterative hard thresholding (IHT). We apply our theoretical results to Gaussian vector autoregressive (VAR) models and obtain new guarantees. Experiments on synthetic sparse VAR models and real-world mobility time series demonstrate that exact sparse methods recover lag structure more accurately and interpretably than classical methods, while achieving comparable prediction error at substantially lower computational cost.
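To make the abstract's central algorithm concrete, the following is a minimal sketch of iterative hard thresholding for the $\ell_0$-constrained least-squares problem $\min_b \|y - Xb\|_2^2$ subject to $\|b\|_0 \le s$. The step-size rule, iteration count, and function name are illustrative choices, not the paper's exact specification.

```python
import numpy as np

def iht(X, y, s, step=None, iters=200):
    """Iterative hard thresholding (sketch): gradient step on the
    least-squares loss followed by projection onto s-sparse vectors.
    Step size and iteration count are illustrative defaults."""
    n, p = X.shape
    if step is None:
        # 1 / sigma_max(X)^2 keeps the gradient step nonexpansive
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ b - y)       # gradient of 0.5 * ||y - Xb||^2
        b = b - step * grad            # gradient descent step
        # hard threshold: zero out all but the s largest-magnitude entries
        small = np.argsort(np.abs(b))[: p - s]
        b[small] = 0.0
    return b

# Illustrative usage on a noiseless synthetic sparse regression
rng = np.random.default_rng(0)
n, p, s = 200, 50, 5
X = rng.standard_normal((n, p))
b_true = np.zeros(p)
b_true[:s] = 1.0
y = X @ b_true
b_hat = iht(X, y, s)
```

On this well-conditioned Gaussian design the iterates contract toward the true sparse coefficient vector and recover its support; the paper's RSC/RSS conditions are what extend this kind of guarantee from i.i.d. designs to $\alpha$-mixing time series data.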