Poster

A Novel Sequential Coreset Method for Gradient Descent Algorithms

Jiawei Huang · Ruomin Huang · Wenjie Liu · Nikolaos Freris · Hu Ding

Keywords: Optimization

Poster Session (Spot D2): Tue 20 Jul 9 a.m. PDT — 11 a.m. PDT
 
Spotlight presentation: Optimization 1
Tue 20 Jul 6 a.m. PDT — 7 a.m. PDT

Abstract:

A wide range of optimization problems arising in machine learning can be solved by gradient descent algorithms, and a central question in this area is how to efficiently compress a large-scale dataset so as to reduce the computational complexity. The coreset is a popular data compression technique that has been extensively studied. However, most existing coreset methods are problem-dependent and cannot be used as a general tool for a broader range of applications. A key obstacle is that they often rely on the pseudo-dimension and total sensitivity bounds, which can be very high or hard to obtain. In this paper, based on the "locality" property of gradient descent algorithms, we propose a new framework, termed "sequential coreset", which effectively avoids these obstacles. Moreover, our method is particularly suitable for sparse optimization, where the coreset size can be further reduced to be only poly-logarithmically dependent on the dimension. In practice, the experimental results suggest that our method can save a large amount of running time compared with the baseline algorithms.
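The abstract does not spell out the construction, but the general idea behind locality-based sequential coresets can be illustrated with a minimal NumPy sketch: at each stage, importance-sample a small reweighted subset around the current iterate, run a few gradient steps on it, then refresh the subset. Everything below (the function names sequential_coreset_gd and per_sample_grads, the gradient-norm sampling proxy, the least-squares loss, and all hyperparameters) is an illustrative assumption, not the authors' exact algorithm.

    import numpy as np

    def per_sample_grads(X, y, w):
        # Gradient of the squared loss 0.5 * (x_i^T w - y_i)^2 for every sample.
        residuals = X @ w - y              # shape (n,)
        return residuals[:, None] * X      # shape (n, d)

    def sequential_coreset_gd(X, y, w0, stages=10, inner_steps=20,
                              m=256, lr=0.1, seed=0):
        # Hypothetical sketch of a sequential coreset loop; all names and
        # hyperparameters are illustrative, not the paper's construction.
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        w = w0.astype(float).copy()
        for _ in range(stages):
            grads = per_sample_grads(X, y, w)
            # Local sensitivity proxy: samples whose gradients are large
            # near the current iterate are sampled with higher probability.
            scores = np.linalg.norm(grads, axis=1) + 1e-12
            probs = scores / scores.sum()
            idx = rng.choice(n, size=m, replace=True, p=probs)
            # Inverse-probability weights keep the coreset gradient an
            # unbiased estimate of the full-data mean gradient.
            weights = 1.0 / (m * probs[idx])
            Xc, yc = X[idx], y[idx]
            for _ in range(inner_steps):
                g = (weights @ per_sample_grads(Xc, yc, w)) / n
                w -= lr * g
        return w

    # Usage: recover a linear model from noisy data without ever running
    # gradient descent on the full dataset.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100_000, 20))
    w_true = rng.normal(size=20)
    y = X @ w_true + 0.1 * rng.normal(size=100_000)
    w_hat = sequential_coreset_gd(X, y, np.zeros(20))

The per-stage refresh is where the "locality" property enters: the small subset only needs to approximate the loss landscape near the current iterate rather than globally, which is what lets such a coreset stay small without problem-specific pseudo-dimension or total sensitivity bounds.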
