Poster
Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization
Michael Metel · Akiko Takeda
Our work focuses on stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on this class of problems is quite limited, and until recently no non-asymptotic convergence results had been reported. We present two simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, whose convergence complexities improve on the current state of the art. We also compare our algorithms' practical performance on empirical risk minimization.
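The problem class described above (a smooth non-convex loss plus a non-smooth non-convex regularizer) is commonly attacked with proximal stochastic gradient steps. The sketch below is a generic illustration of that template, not the authors' algorithms: it pairs an illustrative least-squares loss with the minimax concave penalty (MCP), a standard non-smooth non-convex regularizer whose proximal operator has a closed form. The loss, step size, and penalty parameters are all assumptions made for the example.

```python
import numpy as np

def prox_mcp(v, eta, lam=1.0, gamma=2.0):
    """Closed-form proximal operator prox_{eta*p}(v) of the minimax concave
    penalty (MCP), valid when gamma > eta. MCP is non-smooth at 0 and
    non-convex, matching the regularizer class discussed in the abstract."""
    a = np.abs(v)
    return np.where(a <= eta * lam, 0.0,
           np.where(a <= gamma * lam,
                    np.sign(v) * (a - eta * lam) / (1.0 - eta / gamma),
                    v))

def prox_sgd(A, b, eta=0.01, lam=0.1, gamma=2.0, epochs=50, seed=0):
    """Generic proximal SGD: a stochastic gradient step on the smooth loss
    (least squares here, purely illustrative), followed by the MCP prox."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (A[i] @ x - b[i]) * A[i]   # gradient of the i-th sample's loss
            x = prox_mcp(x - eta * grad, eta, lam, gamma)
    return x
```

Because the prox has a closed form, each iteration costs no more than a plain SGD step; the non-convexity of the regularizer shows up only in the middle branch of `prox_mcp`, which shrinks less aggressively than the soft-thresholding used for the convex l1 penalty.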
Author Information
Michael Metel (RIKEN Center for Advanced Intelligence Project)
Akiko Takeda (The University of Tokyo / RIKEN)
Related Events (a corresponding poster, oral, or spotlight)
-
2019 Oral: Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization
Wed Jun 12th 09:30 -- 09:35 PM, Room 104
More from the Same Authors
-
2018 Poster: Nonconvex Optimization for Regression with Fairness Constraints
Junpei Komiyama · Akiko Takeda · Junya Honda · Hajime Shimao
-
2018 Oral: Nonconvex Optimization for Regression with Fairness Constraints
Junpei Komiyama · Akiko Takeda · Junya Honda · Hajime Shimao