Poster

Generic Coreset for Scalable Learning of Monotonic Kernels: Logistic Regression, Sigmoid and more

Elad Tolochinksy · Ibrahim Jubran · Dan Feldman

Hall E #1126

Keywords: [ PM: Monte Carlo and Sampling Methods ] [ DL: Theory ] [ MISC: General Machine Learning Techniques ] [ T: Learning Theory ]


Abstract: A coreset (or core-set) is a small weighted \emph{subset} $Q$ of an input set $P$ with respect to a given \emph{monotonic} function $f:\mathbb{R}\to\mathbb{R}$ that \emph{provably} approximates its fitting loss $\sum_{p\in P} f(p\cdot x)$ for \emph{any} given $x\in\mathbb{R}^d$. Using $Q$ we can obtain an approximation of $x^*$ that minimizes this loss, by running \emph{existing} optimization algorithms on $Q$. In this work we provide: (i) A lower bound which proves that there are sets with no coresets smaller than $n=|P|$ for general monotonic loss functions. (ii) A proof that, with an additional common regularization term and under a natural assumption that holds e.g. for logistic regression and the sigmoid activation functions, a small coreset exists for \emph{any} input $P$. (iii) A generic coreset construction algorithm that computes such a small coreset $Q$ in $O(nd+n\log n)$ time, and (iv) Experimental results with open-source code which demonstrate that our coresets are effective and are much smaller in practice than predicted in theory.
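To make the coreset idea concrete, here is a minimal Python sketch of the generic sensitivity-sampling template that constructions of this kind follow, applied to a regularized logistic loss. This is an illustration only, not the paper's algorithm: the norm-based sensitivity proxy, the regularization weight `lam`, and the function names `logistic_loss` and `coreset` are placeholder assumptions of this sketch, not the paper's provable bounds.

```python
# Illustrative sensitivity-sampling sketch for a regularized logistic loss.
# The sensitivity proxy below (proportional to each point's norm) is a
# heuristic placeholder, NOT the provable bound from the paper.
import numpy as np

def logistic_loss(P, x, weights=None, lam=1.0):
    """Regularized fitting loss sum_p w_p * f(p . x) + lam * ||x||^2,
    with the monotonic kernel f(t) = log(1 + exp(t))."""
    if weights is None:
        weights = np.ones(len(P))
    margins = P @ x
    return weights @ np.logaddexp(0.0, margins) + lam * (x @ x)

def coreset(P, m, rng):
    """Sample a weighted subset of size m by importance sampling.

    Each point is drawn with probability proportional to a heuristic
    sensitivity proxy, then reweighted so the coreset loss is an
    unbiased estimate of the full loss."""
    n = len(P)
    s = np.linalg.norm(P, axis=1) + 1.0 / n   # placeholder sensitivities
    prob = s / s.sum()
    idx = rng.choice(n, size=m, replace=True, p=prob)
    weights = 1.0 / (m * prob[idx])           # unbiased importance weights
    return P[idx], weights

rng = np.random.default_rng(0)
n, d, m = 10_000, 5, 200
P = rng.normal(size=(n, d))
Q, w = coreset(P, m, rng)
x = rng.normal(size=d)
print("full loss   :", logistic_loss(P, x))
print("coreset loss:", logistic_loss(Q, x, weights=w))
```

With provable sensitivities, the weighted coreset loss would track the full loss for every query $x$ simultaneously, which is the guarantee items (ii) and (iii) of the abstract establish; the heuristic proxy above only mimics the sampling-and-reweighting plumbing.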
