

Poster

A Nearly-Optimal Bound for Fast Regression with $\ell_\infty$ Guarantee

Zhao Song · Mingquan Ye · Junze Yin · Lichen Zhang

Exhibit Hall 1 #334

Abstract: Given a matrix $A \in \mathbb{R}^{n \times d}$ and a vector $b \in \mathbb{R}^n$, we consider the regression problem with $\ell_\infty$ guarantees: finding a vector $x' \in \mathbb{R}^d$ such that $\|x' - x^*\|_\infty \le \frac{\epsilon}{\sqrt{d}} \cdot \|Ax^* - b\|_2 \cdot \|A^\dagger\|$, where $x^*$ is the optimal solution to the regression problem $\min_x \|Ax - b\|_2$. One popular approach to solving the $\ell_2$ regression problem is via sketching: pick a structured random matrix $S \in \mathbb{R}^{m \times n}$ with $m \ll n$ for which $SA$ can be quickly computed, then solve the "sketched" regression problem $x' = \arg\min_x \|SAx - Sb\|_2$. In this paper, we show that in order to obtain such an $\ell_\infty$ guarantee for $\ell_2$ regression, one has to use sketching matrices that are *dense*. To the best of our knowledge, this is the first use case in which dense sketching matrices are necessary. On the algorithmic side, we prove that there exists a distribution of dense sketching matrices with $m = \epsilon^{-2} d \log^3(n/\delta)$ such that solving the sketched regression problem gives the $\ell_\infty$ guarantee with probability at least $1 - \delta$. Moreover, the matrix $SA$ can be computed in time $O(nd \log n)$. Our row count is nearly optimal up to logarithmic factors and significantly improves the result in [Price, Song and Woodruff, ICALP'17], in which $m = \Omega(\epsilon^{-2} d^{1+\gamma})$ for $\gamma \in (0, 1)$ is required. Moreover, we develop a novel analytical framework for $\ell_\infty$ guarantee regression that utilizes the *Oblivious Coordinate-wise Embedding* (OCE) property introduced in [Song and Yu, ICML'21]. Our analysis is much simpler and more general than that of [Price, Song and Woodruff, ICALP'17]. Leveraging this framework, we extend the $\ell_\infty$ guarantee regression result to dense sketching matrices for computing fast tensor products of vectors.
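The sketch-and-solve pipeline described above can be illustrated with a minimal NumPy sketch. Note this uses a plain dense Gaussian sketching matrix for simplicity, not the paper's specific distribution of dense sketching matrices (whose product $SA$ is computable in $O(nd \log n)$ time); the row count formula below is likewise only an illustrative stand-in for the paper's bound.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic overdetermined least-squares instance (illustrative sizes).
n, d = 2000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Optimal solution x* = argmin_x ||Ax - b||_2.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)

# Dense sketch S with m ~ eps^-2 * d * log(n) rows, m << n.
# (A Gaussian sketch stands in for the paper's distribution.)
eps = 0.5
m = int(np.ceil(eps**-2 * d * np.log(n)))
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Sketched problem x' = argmin_x ||SAx - Sb||_2: only m rows to solve.
x_prime, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Coordinate-wise (ell_infty) deviation of x' from x*.
err = np.max(np.abs(x_prime - x_star))
print(m, err)
```

With a small residual $\|Ax^* - b\|_2$, the coordinate-wise error `err` is correspondingly small, matching the flavor of the $\ell_\infty$ guarantee, while the sketched solve touches only $m \ll n$ rows.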
