Multi-task Linear Regression without Eigenvalue Lower Bounds: Adaptivity, Robustness and Safety
Seok-Jin Kim
Abstract
We study the multi-task linear regression problem with contaminated tasks. We consider a setting in which the unknown parameters of the tasks are close to one another in the $\ell_2$-norm, but a certain proportion of tasks are outliers. In the presence of outliers, existing works develop theory under the assumption that the empirical second moment (normalized Gram matrix) of each task has a minimum eigenvalue of order $\Omega(1)$. However, this assumption is violated in many cases, and we propose a novel loss function that operates efficiently under a strictly weaker assumption. Under this weaker assumption, we obtain an optimal Mean Squared Error (MSE) bound, and even when the assumption is violated, we still achieve a favorable MSE rate. Hence, our methodology adapts to the degree of task similarity and the proportion of outliers, both of which are unknown (adaptivity and robustness), and also remains safe under assumption violation (safety).