

Oral

Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization

Michael Metel · Akiko Takeda

Abstract:

Our work focuses on stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on this class of problems is quite limited, and until very recently no non-asymptotic convergence results had been reported. We present two simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, which have better convergence complexities than the current state of the art. We also demonstrate superior performance of our algorithms in practice for empirical risk minimization on well-known datasets.
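To make the setting concrete, a standard way to write this problem class (an illustrative formulation based on the abstract's description, not quoted from the paper) is, for the finite-sum and general stochastic cases respectively:

```latex
% Finite-sum problem: n component losses plus a regularizer
\min_{x \in \mathbb{R}^d} \; F(x) := \frac{1}{n} \sum_{i=1}^{n} f_i(x) + g(x)

% General stochastic problem: expectation over a random variable \xi
\min_{x \in \mathbb{R}^d} \; F(x) := \mathbb{E}_{\xi}\left[ f(x; \xi) \right] + g(x)
```

Here each $f_i$ (or $f(\cdot;\xi)$) is assumed smooth but possibly non-convex, and the regularizer $g$ is non-smooth and non-convex; common examples of such regularizers include SCAD, MCP, and $\ell_p$ quasi-norm penalties with $0 < p < 1$.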
