Abstract:
Given a matrix A ∈ R^{n×d} and a vector b ∈ R^n, we consider the regression problem with ℓ_∞ guarantees: finding a vector x′ ∈ R^d such that ||x′ − x*||_∞ ≤ (ε/√d)·||Ax* − b||_2·||A†||, where x* is the optimal solution to the regression min_x ||Ax − b||_2. One popular approach to solving the ℓ_2 regression problem is sketching: pick a structured random matrix S ∈ R^{m×n} with m ≪ n for which SA can be computed quickly, then solve the "sketched" regression problem x′ = argmin_x ||SAx − Sb||_2. In this paper, we show that in order to obtain such an ℓ_∞ guarantee for ℓ_2 regression, one has to use sketching matrices that are *dense*. To the best of our knowledge, this is the first use case in which dense sketching matrices are necessary. On the algorithmic side, we prove that there exists a distribution over dense sketching matrices with m = ε^{-2}·d·log^3(n/δ) such that solving the sketched regression problem gives the ℓ_∞ guarantee with probability at least 1 − δ. Moreover, the matrix SA can be computed in O(nd log n) time. Our row count is nearly optimal up to logarithmic factors, and it significantly improves the result of [Price, Song and Woodruff, ICALP'17], which requires m = Ω(ε^{-2}·d^{1+γ}) for some γ ∈ (0, 1). In addition, we develop a novel analytical framework for ℓ_∞-guarantee regression that utilizes the *Oblivious Coordinate-wise Embedding* (OCE) property introduced in [Song and Yu, ICML'21]. Our analysis is much simpler and more general than that of [Price, Song and Woodruff, ICALP'17]. Leveraging this framework, we extend the ℓ_∞-guarantee regression result to dense sketching matrices for computing fast tensor products of vectors.
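The sketch-and-solve recipe described above can be illustrated with a minimal numpy example. Note the assumptions: the sketch below is a plain dense Gaussian matrix rather than the structured dense construction of the paper (which additionally admits O(nd log n) multiplication), and the problem sizes are arbitrary; the point is only to show the pipeline of solving argmin_x ||SAx − Sb||_2 in place of the full problem and comparing the two solutions coordinate-wise.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2048, 10, 400  # m << n; sizes are illustrative, not from the paper

# Synthetic overdetermined least-squares instance Ax ~ b
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Exact solution x* = argmin_x ||Ax - b||_2
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)

# Dense Gaussian sketch S in R^{m x n}, scaled so S preserves norms in
# expectation; the paper uses a structured dense distribution instead.
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Sketched solution x' = argmin_x ||SAx - Sb||_2
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Coordinate-wise (ell_inf) deviation from the exact solution
print(np.max(np.abs(x_sketch - x_star)))
```

With m on the order of d (up to the accuracy and log factors in the theorem), the printed ℓ_∞ error is a small fraction of the ℓ_2 error one would tolerate per coordinate, which is exactly the regime the guarantee addresses.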