Abstract:
Given a matrix $A\in \mathbb{R}^{n\times d}$ and a vector $b\in \mathbb{R}^n$, we consider the regression problem with $\ell_\infty$ guarantees: finding a vector $x'\in \mathbb{R}^d$ such that $\|x'-x^*\|_\infty \leq \frac{\epsilon}{\sqrt{d}}\cdot \|Ax^*-b\|_2\cdot \|A^\dagger\|$, where $x^*$ is the optimal solution to the regression problem $\min_x \|Ax-b\|_2$. A popular approach to solving the $\ell_2$ regression problem is sketching: pick a structured random matrix $S\in \mathbb{R}^{m\times n}$ with $m\ll n$ for which $SA$ can be computed quickly, and solve the ``sketched'' regression problem $x'=\mathrm{argmin}_x \|SAx-Sb\|_2$. In this paper, we show that in order to obtain such an $\ell_\infty$ guarantee for $\ell_2$ regression, one has to use sketching matrices that are *dense*. To the best of our knowledge, this is the first use case in which dense sketching matrices are provably necessary. On the algorithmic side, we prove that there exists a distribution over dense sketching matrices with $m=\epsilon^{-2}d\log^3(n/\delta)$ rows such that solving the sketched regression problem gives the $\ell_\infty$ guarantee with probability at least $1-\delta$. Moreover, the matrix $SA$ can be computed in $O(nd\log n)$ time. Our row count is nearly optimal up to logarithmic factors and significantly improves on the result of [Price, Song and Woodruff, ICALP'17], which requires $m=\Omega(\epsilon^{-2}d^{1+\gamma})$ for $\gamma\in (0, 1)$. We also develop a novel analytical framework for regression with $\ell_\infty$ guarantees that exploits the *Oblivious Coordinate-wise Embedding* (OCE) property introduced in [Song and Yu, ICML'21]; our analysis is much simpler and more general than that of [Price, Song and Woodruff, ICALP'17]. Leveraging this framework, we extend the $\ell_\infty$ guarantee regression result to dense sketching matrices that support fast tensor products of vectors.
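To make the sketch-and-solve paradigm concrete, here is a minimal illustration of solving $\mathrm{argmin}_x \|SAx-Sb\|_2$ with a dense SRHT-style sketch $S = \sqrt{n/m}\, PHD$ (random signs $D$, Walsh-Hadamard transform $H$, uniform row sampler $P$), whose rows are dense and for which $SA$ costs $O(nd\log n)$ to form. This is a generic sketch, not necessarily the exact dense distribution analyzed in the paper; the helper names `fwht` and `sketch_and_solve` and the row count `m = 40 * d` are illustrative assumptions.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform along axis 0 (length a power of 2).

    Runs in O(n log n) per column, which is what makes the sketched
    product S @ A computable in O(nd log n) time overall.
    """
    x = x.copy()
    n, h = x.shape[0], 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            x[i:i + h] = a + x[i + h:i + 2 * h]
            x[i + h:i + 2 * h] = a - x[i + h:i + 2 * h]
        h *= 2
    return x / np.sqrt(n)  # orthonormal normalization

def sketch_and_solve(A, b, m, seed=0):
    """Solve min_x ||S A x - S b||_2 for a dense SRHT-style sketch.

    S = sqrt(n/m) * P * H * D; every row of S is dense. This is a
    generic sketch-and-solve illustration, not the paper's exact
    sketching distribution.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    M = np.column_stack([A, b])                          # sketch A and b together
    M = rng.choice([-1.0, 1.0], size=n)[:, None] * M     # apply random signs D
    M = fwht(M)                                          # apply Hadamard H
    rows = rng.choice(n, size=m, replace=False)          # row sampler P
    SM = np.sqrt(n / m) * M[rows]
    SA, Sb = SM[:, :-1], SM[:, -1]
    return np.linalg.lstsq(SA, Sb, rcond=None)[0]

# Compare the sketched solution with the exact least-squares solution.
rng = np.random.default_rng(1)
n, d = 2**13, 16                                 # n must be a power of 2 here
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
x_prime = sketch_and_solve(A, b, m=40 * d)       # illustrative m; theory: eps^-2 d log^3(n/delta)
print(np.max(np.abs(x_prime - x_star)))          # coordinate-wise (l_inf) error
```

The final print reports $\|x'-x^*\|_\infty$, the quantity the paper's $\ell_\infty$ guarantee bounds by $\frac{\epsilon}{\sqrt{d}}\cdot \|Ax^*-b\|_2\cdot \|A^\dagger\|$.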