Poster

Near Input Sparsity Time Kernel Embeddings via Adaptive Sampling

David Woodruff · Amir Zandieh

Keywords: [ Dimensionality Reduction ] [ Kernel Methods ] [ Matrix/Tensor Methods ] [ General Machine Learning Techniques ]


Abstract: To accelerate kernel methods, we propose a near-input-sparsity-time method for sampling the high-dimensional space implicitly defined by a kernel transformation. Our main contribution is an importance sampling method for subsampling the feature space of a degree-$q$ tensoring of data points in almost input-sparsity time, improving on the recent oblivious sketching of Ahle et al. (2020) by a factor of $q^{5/2}/\epsilon^2$. This leads to a subspace embedding for the polynomial kernel, as well as the Gaussian kernel, with a target dimension that depends only linearly on the statistical dimension of the kernel, and in time that depends only linearly on the sparsity of the input dataset. We show how our subspace embedding bounds imply new statistical guarantees for kernel ridge regression. Furthermore, we empirically show that on large-scale regression tasks our algorithm outperforms state-of-the-art kernel approximation methods.
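
To make the tensor-product sampling idea concrete, the following is a minimal, hypothetical sketch, not the paper's algorithm: it subsamples coordinates of the degree-$q$ tensoring $x^{\otimes q}$ under a simple data-dependent importance distribution so that the sampled, rescaled features give an unbiased estimate of the polynomial kernel $(x^\top y)^q$. The function name `sample_poly_features`, the choice of sampling distribution, and all parameters are illustrative assumptions; the paper's adaptive sampling scheme is what achieves the near-input-sparsity running time and the statistical-dimension-linear target dimension, which this toy version does not.

```python
import numpy as np

def sample_poly_features(X, q=3, m=512, rng=None):
    """Toy importance-sampling sketch for the degree-q polynomial kernel.

    Draws m index tuples (i_1, ..., i_q) i.i.d. from a data-dependent
    distribution p over the d coordinates, and rescales each sampled
    feature x[i_1] * ... * x[i_q] so that Z(x)^T Z(y) is an unbiased
    estimate of <x^{(q)}, y^{(q)}> = (x^T y)^q. Illustrative only; the
    paper's adaptive sampling is more sophisticated.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Hypothetical importance measure: average squared coordinate magnitude.
    p = (X ** 2).mean(axis=0)
    p = p / p.sum()
    # m index tuples of length q, drawn i.i.d. from p; shape (m, q).
    idx = rng.choice(d, size=(m, q), p=p)
    # Rescale by 1 / sqrt(m * p[i_1] * ... * p[i_q]) for unbiasedness.
    scale = 1.0 / np.sqrt(m * np.prod(p[idx], axis=1))   # shape (m,)
    # Each sampled feature is a product of q entries of each row of X.
    Z = np.prod(X[:, idx], axis=2) * scale               # shape (n, m)
    return Z

# Usage: Z @ Z.T approximates the polynomial kernel (X @ X.T) ** q.
X = np.random.default_rng(0).standard_normal((100, 20)) / np.sqrt(20)
Z = sample_poly_features(X, q=3, m=4096, rng=1)
K_exact = (X @ X.T) ** 3
print(np.linalg.norm(K_exact - Z @ Z.T) / np.linalg.norm(K_exact))
```

Because each sampled coordinate of $x^{\otimes q}$ is just a product of $q$ entries of $x$, the sketch never materializes the $d^q$-dimensional feature space; the importance weights only need to make the estimator unbiased, and a well-chosen distribution (as in the paper's adaptive scheme) also controls its variance.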
