Recently, a considerable amount of work has been devoted to studying the algorithmic stability and generalization of stochastic gradient descent (SGD). However, existing stability analyses require restrictive assumptions: bounded gradients, and smoothness and convexity of the loss function. In this paper, we provide a fine-grained analysis of stability and generalization for SGD that substantially relaxes these assumptions. First, we establish stability and generalization bounds for SGD without the bounded gradient assumption. The key idea is a new stability measure, called on-average model stability, for which we develop novel bounds controlled by the risks of the SGD iterates. This yields generalization bounds that depend on the behavior of the best model, and leads to the first known fast rates in the low-noise setting derived via a stability approach. Second, we relax the smoothness assumption by considering loss functions with Hölder continuous (sub)gradients, for which we show that optimal bounds are still achieved by balancing computation and stability. To the best of our knowledge, this gives the first stability and generalization bounds for SGD with non-smooth loss functions (e.g., the hinge loss). Finally, we study learning problems with (strongly) convex objectives but non-convex individual loss functions.
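To make the on-average model stability idea concrete, the following Python sketch (our own illustration, not code from the paper) trains SGD with the non-smooth hinge loss on a dataset S and on each neighboring dataset S^(i) obtained by replacing one example, then averages the distances between the resulting models. All function names, the step size, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def hinge_subgradient(w, x, y):
    # Subgradient of the non-smooth hinge loss max(0, 1 - y <w, x>).
    return -y * x if y * np.dot(w, x) < 1 else np.zeros_like(w)

def sgd(X, y, steps, eta, seed=0):
    # Plain SGD; a fixed seed couples the index sequence across runs,
    # mirroring how stability arguments compare trajectories that draw
    # the same indices on neighboring datasets.
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = X.shape[0]
    for _ in range(steps):
        i = rng.integers(n)
        w -= eta * hinge_subgradient(w, X[i], y[i])
    return w

def on_average_model_stability(X, y, steps=1000, eta=0.01):
    # Single-run estimate of (1/n) * sum_i || A(S) - A(S^(i)) ||_2,
    # where S^(i) replaces the i-th example with a fresh one.
    n, d = X.shape
    w_full = sgd(X, y, steps, eta)
    rng = np.random.default_rng(42)
    dists = []
    for i in range(n):
        Xp, yp = X.copy(), y.copy()
        Xp[i] = rng.standard_normal(d)       # replacement example
        yp[i] = rng.choice([-1.0, 1.0])
        w_pert = sgd(Xp, yp, steps, eta)
        dists.append(np.linalg.norm(w_full - w_pert))
    return np.mean(dists)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 50, 5
    X = rng.standard_normal((n, d))
    y = np.sign(X @ rng.standard_normal(d))  # linearly separable labels
    print("estimated on-average model stability:",
          on_average_model_stability(X, y))
```

A proper estimate would also average over the algorithm's own randomness (the expectation in the definition); a single coupled run is kept here only to keep the sketch short.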
Author Information
Yunwen Lei (University of Kaiserslautern)
Yiming Ying (SUNY Albany)
More from the Same Authors
- 2023 Poster: Generalization Analysis for Contrastive Representation Learning
  Yunwen Lei · Tianbao Yang · Yiming Ying · Ding-Xuan Zhou
- 2022 Poster: On the Generalization Analysis of Adversarial Learning
  Waleed Mustafa · Yunwen Lei · Marius Kloft
- 2022 Spotlight: On the Generalization Analysis of Adversarial Learning
  Waleed Mustafa · Yunwen Lei · Marius Kloft
- 2021 Poster: Stability and Generalization of Stochastic Gradient Methods for Minimax Problems
  Yunwen Lei · Zhenhuan Yang · Tianbao Yang · Yiming Ying
- 2021 Oral: Stability and Generalization of Stochastic Gradient Methods for Minimax Problems
  Yunwen Lei · Zhenhuan Yang · Tianbao Yang · Yiming Ying
- 2021 Poster: Federated Deep AUC Maximization for Hetergeneous Data with a Constant Communication Complexity
  Zhuoning Yuan · Zhishuai Guo · Yi Xu · Yiming Ying · Tianbao Yang
- 2021 Spotlight: Federated Deep AUC Maximization for Hetergeneous Data with a Constant Communication Complexity
  Zhuoning Yuan · Zhishuai Guo · Yi Xu · Yiming Ying · Tianbao Yang
- 2019 Poster: Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
  Baojian Zhou · Feng Chen · Yiming Ying
- 2019 Oral: Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
  Baojian Zhou · Feng Chen · Yiming Ying
- 2018 Poster: Stochastic Proximal Algorithms for AUC Maximization
  Michael Natole Jr · Yiming Ying · Siwei Lyu
- 2018 Oral: Stochastic Proximal Algorithms for AUC Maximization
  Michael Natole Jr · Yiming Ying · Siwei Lyu