Oral
Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization
Michael Metel · Akiko Takeda

Wed Jun 12th 02:30 -- 02:35 PM @ Room 104

Our work focuses on stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on this class of problems is quite limited, and until very recently no non-asymptotic convergence results had been reported. We present two simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, whose convergence complexities improve on the current state of the art. We also demonstrate superior performance of our algorithms in practice for empirical risk minimization on well-known datasets.
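For context, the problem class the abstract describes can be written, in standard notation (this formulation is a gloss on the abstract, not copied from the paper), as

    \min_{x \in \mathbb{R}^d} f(x) + g(x),
    \qquad \text{with} \qquad
    f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x)
    \;\; \text{or} \;\;
    f(x) = \mathbb{E}_{\xi}\left[ f(x;\xi) \right],

where f is smooth but possibly non-convex and g is a non-smooth non-convex regularizer; the two displayed forms of f correspond to the finite-sum and general stochastic settings mentioned above.

To illustrate the mechanics, below is a minimal Python sketch of a generic proximal stochastic gradient iteration for this problem class. This is a standard baseline scheme, not the paper's two algorithms; the function names and the toy data are illustrative assumptions.

import numpy as np

def prox_sgd_step(x, stoch_grad, prox_g, step_size):
    # Gradient step on the smooth loss f, followed by the proximal
    # mapping of the (possibly non-convex) regularizer g.
    return prox_g(x - step_size * stoch_grad(x), step_size)

def prox_l0(v, step_size, lam=0.1):
    # Hard-thresholding prox of g(x) = lam * ||x||_0, a standard
    # non-smooth non-convex regularizer: zero out coordinates whose
    # squared value falls below 2 * step_size * lam.
    out = v.copy()
    out[v**2 < 2.0 * step_size * lam] = 0.0
    return out

# Toy finite-sum usage: minibatch least-squares gradients (the loss
# here is convex; it only stands in for a smooth f).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((100, 20)), rng.standard_normal(100)

def stoch_grad(x, batch=10):
    idx = rng.integers(0, 100, size=batch)
    return A[idx].T @ (A[idx] @ x - b[idx]) / batch

x = np.zeros(20)
for _ in range(200):
    x = prox_sgd_step(x, stoch_grad, prox_l0, step_size=0.01)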

Author Information

Michael Metel (RIKEN Center for Advanced Intelligence Project)
Akiko Takeda (The University of Tokyo / RIKEN)
