Recent years have witnessed exciting progress in the study of stochastic variance reduced gradient methods (e.g., SVRG, SAGA), their accelerated variants (e.g., Katyusha), and their extensions to many different settings (e.g., online, sparse, asynchronous, distributed). Among them, accelerated methods enjoy improved convergence rates but have complex coupling structures, which makes them hard to extend to settings such as the sparse and asynchronous ones, where perturbations arise. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG), which enjoys the best-known convergence rates for both strongly convex and non-strongly convex problems. Moreover, we present its efficient sparse and asynchronous variants and theoretically analyze their convergence rates in these settings. Finally, extensive experiments on various machine learning problems such as logistic regression illustrate the practical improvement in both serial and asynchronous settings.
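For context, the sketch below shows one epoch of the SVRG-style variance-reduced gradient update that MiG and the other methods named above build on, applied to logistic regression. This is a minimal illustration, not the paper's MiG algorithm: the step size eta, the inner-loop length m, and the function names are assumptions made for the example.

```python
# A minimal sketch of one SVRG-style epoch for logistic regression.
# All names (logistic_grad, svrg_epoch, eta, m) are illustrative
# assumptions, not the paper's implementation of MiG.
import numpy as np

def logistic_grad(w, x, y):
    """Gradient of the logistic loss -log(sigmoid(y * x.w)) at one sample."""
    return -y * x / (1.0 + np.exp(y * x.dot(w)))

def svrg_epoch(w, X, Y, eta, m):
    """Run m inner steps from snapshot w using variance-reduced gradients."""
    n = X.shape[0]
    # Full gradient at the snapshot, computed once per epoch.
    mu = np.mean([logistic_grad(w, X[i], Y[i]) for i in range(n)], axis=0)
    w_snap, w_k = w.copy(), w.copy()
    for _ in range(m):
        i = np.random.randint(n)
        # Variance-reduced stochastic gradient: unbiased, and its variance
        # shrinks as the iterate and the snapshot approach the optimum.
        g = (logistic_grad(w_k, X[i], Y[i])
             - logistic_grad(w_snap, X[i], Y[i]) + mu)
        w_k -= eta * g
    return w_k
```

Accelerated variants such as Katyusha (and MiG itself) add a momentum-style coupling on top of this update, which is the coupling structure the abstract refers to.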
Author Information
Kaiwen Zhou (The Chinese University of Hong Kong)
Fanhua Shang (The Chinese University of Hong Kong)
James Cheng (CUHK)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates »
  Wed. Jul 11th, 04:15 -- 07:00 PM, Hall B #205
More from the Same Authors
- 2021: Learned Interpretable Residual Extragradient ISTA for Sparse Coding »
  Connie Kong · Fanhua Shang
- 2022: Invariance Principle Meets Out-of-Distribution Generalization on Graphs »
  Yongqiang Chen · Yonggang Zhang · Yatao Bian · Han Yang · Kaili MA · Binghui Xie · Tongliang Liu · Bo Han · James Cheng
- 2022: Pareto Invariant Risk Minimization »
  Yongqiang Chen · Kaiwen Zhou · Yatao Bian · Binghui Xie · Kaili MA · Yonggang Zhang · Han Yang · Bo Han · James Cheng
- 2023: Towards Understanding Feature Learning in Out-of-Distribution Generalization »
  Yongqiang Chen · Wei Huang · Kaiwen Zhou · Yatao Bian · Bo Han · James Cheng
- 2022 Poster: Kill a Bird with Two Stones: Closing the Convergence Gaps in Non-Strongly Convex Optimization by Directly Accelerated SVRG with Double Compensation and Snapshots »
  Yuanyuan Liu · Fanhua Shang · Weixin An · Hongying Liu · Zhouchen Lin
- 2022 Spotlight: Kill a Bird with Two Stones: Closing the Convergence Gaps in Non-Strongly Convex Optimization by Directly Accelerated SVRG with Double Compensation and Snapshots »
  Yuanyuan Liu · Fanhua Shang · Weixin An · Hongying Liu · Zhouchen Lin
- 2022 Poster: On the Finite-Time Complexity and Practical Computation of Approximate Stationarity Concepts of Lipschitz Functions »
  Lai Tian · Kaiwen Zhou · Anthony Man-Cho So
- 2022 Poster: Fast and Reliable Evaluation of Adversarial Robustness with Minimum-Margin Attack »
  Ruize Gao · Jiongxiao Wang · Kaiwen Zhou · Feng Liu · Binghui Xie · Gang Niu · Bo Han · James Cheng
- 2022 Spotlight: Fast and Reliable Evaluation of Adversarial Robustness with Minimum-Margin Attack »
  Ruize Gao · Jiongxiao Wang · Kaiwen Zhou · Feng Liu · Binghui Xie · Gang Niu · Bo Han · James Cheng
- 2022 Spotlight: On the Finite-Time Complexity and Practical Computation of Approximate Stationarity Concepts of Lipschitz Functions »
  Lai Tian · Kaiwen Zhou · Anthony Man-Cho So