Spotlight
On the Finite-Time Complexity and Practical Computation of Approximate Stationarity Concepts of Lipschitz Functions
Lai Tian · Kaiwen Zhou · Anthony Man-Cho So
We report a practical, finite-time algorithmic scheme that computes approximately stationary points of nonconvex nonsmooth Lipschitz functions. In particular, we are interested in two approximate stationarity notions for nonconvex nonsmooth problems: Goldstein approximate stationarity (GAS) and near-approximate stationarity (NAS). For GAS, our scheme removes the unrealistic subgradient selection oracle assumption of (Zhang et al., 2020, Assumption 1) and computes GAS points with the same finite-time complexity. For NAS, Davis & Drusvyatskiy (2019) showed that $\rho$-weakly convex functions admit finite-time computation, while Tian & So (2021) provided matching impossibility results ruling out dimension-free finite-time complexity for first-order methods. Complementing these developments, we isolate a new class of functions that can be Clarke irregular (and hence not weakly convex) and show that our new algorithmic scheme computes NAS points for functions in this class in finite time. To demonstrate the wide applicability of our theoretical framework, we show that the $\rho$-margin SVM and $1$-layer and $2$-layer ReLU neural networks, all of which are Clarke irregular, satisfy our new conditions.
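For readers unfamiliar with the GAS notion mentioned above, the underlying Goldstein subdifferential can be sketched as follows. This is the standard definition from the nonsmooth optimization literature (as in Zhang et al., 2020), not a construction specific to this paper:

```latex
% Goldstein \epsilon-subdifferential of a Lipschitz function f at x:
% the convex hull of all Clarke subgradients taken within the
% closed \epsilon-ball \mathbb{B}_\epsilon(x) around x.
\partial_\epsilon f(x)
  \;=\;
  \mathrm{conv}\!\left( \bigcup_{y \in \mathbb{B}_\epsilon(x)} \partial f(y) \right)

% A point x is (\delta,\epsilon)-stationary (a GAS point) if some
% element of the Goldstein \epsilon-subdifferential has norm at most \delta:
\min_{g \in \partial_\epsilon f(x)} \|g\| \;\le\; \delta
```

Intuitively, GAS relaxes Clarke stationarity by allowing subgradients from a small neighborhood, which is what makes finite-time guarantees possible for general Lipschitz functions.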
Author Information
Lai Tian (The Chinese University of Hong Kong)
Kaiwen Zhou (The Chinese University of Hong Kong)
Anthony Man-Cho So (The Chinese University of Hong Kong)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: On the Finite-Time Complexity and Practical Computation of Approximate Stationarity Concepts of Lipschitz Functions »
  Tue. Jul 19th through Wed the 20th, Room Hall E #1207
More from the Same Authors
- 2022: Pareto Invariant Risk Minimization »
  Yongqiang Chen · Kaiwen Zhou · Yatao Bian · Binghui Xie · Kaili MA · Yonggang Zhang · Han Yang · Bo Han · James Cheng
- 2023: Towards Understanding Feature Learning in Out-of-Distribution Generalization »
  Yongqiang Chen · Wei Huang · Kaiwen Zhou · Yatao Bian · Bo Han · James Cheng
- 2023 Poster: Projected Tensor Power Method for Hypergraph Community Recovery »
  Jinxin Wang · Yuen-Man Pun · Xiaolu Wang · Peng Wang · Anthony Man-Cho So
- 2022 Poster: Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering »
  Peng Wang · Huikang Liu · Anthony Man-Cho So · Laura Balzano
- 2022 Poster: Fast and Reliable Evaluation of Adversarial Robustness with Minimum-Margin Attack »
  Ruize Gao · Jiongxiao Wang · Kaiwen Zhou · Feng Liu · Binghui Xie · Gang Niu · Bo Han · James Cheng
- 2022 Spotlight: Fast and Reliable Evaluation of Adversarial Robustness with Minimum-Margin Attack »
  Ruize Gao · Jiongxiao Wang · Kaiwen Zhou · Feng Liu · Binghui Xie · Gang Niu · Bo Han · James Cheng
- 2022 Spotlight: Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering »
  Peng Wang · Huikang Liu · Anthony Man-Cho So · Laura Balzano
- 2021 Poster: Optimal Non-Convex Exact Recovery in Stochastic Block Model via Projected Power Method »
  Peng Wang · Huikang Liu · Zirui Zhou · Anthony Man-Cho So
- 2021 Spotlight: Optimal Non-Convex Exact Recovery in Stochastic Block Model via Projected Power Method »
  Peng Wang · Huikang Liu · Zirui Zhou · Anthony Man-Cho So
- 2020 Poster: A Nearly-Linear Time Algorithm for Exact Community Recovery in Stochastic Block Model »
  Peng Wang · Zirui Zhou · Anthony Man-Cho So
- 2018 Poster: A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates »
  Kaiwen Zhou · Fanhua Shang · James Cheng
- 2018 Oral: A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates »
  Kaiwen Zhou · Fanhua Shang · James Cheng