Poster
Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number
Zaiyi Chen · Yi Xu · Haoyuan Hu · Tianbao Yang

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #100
An important class of non-convex objectives with wide applications in machine learning consists of a sum of $n$ smooth functions and a non-smooth convex function. Numerous studies have been devoted to solving these problems by leveraging one of two types of variance reduction techniques: SVRG-type methods, which compute a full gradient occasionally, and SAGA-type methods, which maintain $n$ stochastic gradients at every iteration. In practice, SVRG-type methods are preferred to SAGA-type methods due to their potentially lower memory cost. An interesting question that has been largely ignored is how to improve the complexity of variance reduction methods for problems with a large condition number, which measures the degree to which the objective is close to a convex function. In this paper, we present a simple but non-trivial boosting of Katyusha, a state-of-the-art SVRG-type method for convex problems, that enjoys an improved complexity for solving non-convex problems with a large condition number (i.e., problems that are close to convex). To the best of our knowledge, its complexity has the best dependence on $n$ and the degree of non-convexity, and also matches that of a recent SAGA-type accelerated stochastic algorithm for a constrained non-convex smooth optimization problem.
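For concreteness, the problem class above is the composite finite-sum objective $\min_x F(x) = \frac{1}{n}\sum_{i=1}^n f_i(x) + g(x)$, with each $f_i$ smooth (possibly non-convex) and $g$ convex but non-smooth. The following is a minimal Python sketch of the plain SVRG-type variance-reduction scheme the abstract contrasts with SAGA-type methods; it is an illustration only, not the paper's Katalyst method, and the callables grad_i and prox_g are hypothetical user-supplied functions.

import numpy as np

# Minimal sketch of a plain proximal-SVRG loop for
#   min_x F(x) = (1/n) * sum_i f_i(x) + g(x),
# with each f_i smooth and g convex but possibly non-smooth.
# Illustrative only; this is NOT the paper's Katalyst algorithm.
def prox_svrg(grad_i, prox_g, x0, n, step, n_epochs=10, m=None, seed=0):
    rng = np.random.default_rng(seed)
    m = m or n                       # inner-loop length, one "epoch"
    x_tilde = np.asarray(x0, dtype=float).copy()
    for _ in range(n_epochs):
        # SVRG-type: recompute the full gradient occasionally (once per
        # epoch) instead of storing n gradients as SAGA-type methods do.
        full_grad = np.mean([grad_i(i, x_tilde) for i in range(n)], axis=0)
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced gradient estimator: unbiased for the full
            # gradient, with variance shrinking as x approaches x_tilde.
            v = grad_i(i, x) - grad_i(i, x_tilde) + full_grad
            x = prox_g(x - step * v, step)
        x_tilde = x
    return x_tilde

# Example usage: l1-regularized least squares (soft-thresholding prox).
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 3.0]) + 0.01 * rng.normal(size=100)
lam = 0.1
grad_i = lambda i, x: A[i] * (A[i] @ x - b[i])
prox_l1 = lambda z, s: np.sign(z) * np.maximum(np.abs(z) - s * lam, 0.0)
print(prox_svrg(grad_i, prox_l1, np.zeros(5), n=100, step=0.05))

Note the memory contrast the abstract highlights: the sketch stores only the reference point x_tilde and one full gradient, whereas a SAGA-type method would keep a table of $n$ per-sample gradients.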

Author Information

Zaiyi Chen (Cainiao AI)
Yi Xu (The University of Iowa)
Haoyuan Hu (Artificial Intelligence Department, Zhejiang Cainiao Supply Chain Management Co.)
Tianbao Yang (The University of Iowa)
