Plenary Talk in Workshop: Beyond first-order methods in machine learning systems

Faster Empirical Risk Minimization

Jelena Diakonikolas


Abstract:

Empirical Risk Minimization (ERM) problems are central to machine learning, and their efficient optimization has been studied from different perspectives, often taking advantage of the finite-sum structure present in typical problem formulations. In particular, tight oracle complexity bounds have been obtained under fairly general assumptions about the loss functions. In this talk, I will present a rather surprising and general result that exploits the separability of nonsmooth convex loss functions with efficiently computable proximal operators, such as the hinge loss and the sum of absolute errors, to obtain an algorithm whose complexity is significantly lower than what the lower bounds for general nonsmooth convex losses predict. Crucial to this result is the use of proximal operators, which goes beyond simple gradient-oracle access to the function. The talk is based on joint work with Chaobing Song and Stephen Wright.
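For context, "efficiently computable proximal operators" means that for each loss the prox map, prox_{λf}(v) = argmin_x { λ f(x) + (1/2)(x - v)^2 }, has a cheap closed form. Both losses named in the abstract admit one, as the minimal NumPy sketch below illustrates. These closed forms are standard facts about the two losses, not the talk's algorithm itself, and the helper names prox_abs and prox_hinge are chosen here for illustration.

```python
import numpy as np

def prox_abs(v, lam):
    """Prox of lam * |x| (soft-thresholding), the per-term prox
    for the sum-of-absolute-errors loss."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_hinge(v, lam):
    """Prox of lam * max(0, 1 - x), the hinge loss.
    Case analysis: v <= 1 - lam -> v + lam; 1 - lam < v < 1 -> 1; v >= 1 -> v,
    which collapses to the single expression below."""
    return np.minimum(v + lam, np.maximum(v, 1.0))
```

Because each evaluation is a single elementwise formula, applying the prox across all n terms of the finite sum costs O(n), which is what makes prox access a meaningfully stronger (yet still cheap) oracle than a plain subgradient for these losses.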