Algorithmic stability is a key characteristic for ensuring the generalization ability of a learning algorithm. Among the different notions of stability, \emph{uniform stability} is arguably the most popular, as it yields exponential generalization bounds. However, uniform stability only considers the worst-case change in the loss (the so-called sensitivity) caused by removing a single data point, which makes it distribution-independent and therefore undesirable. In many cases the worst-case sensitivity of the loss is much larger than the average sensitivity over the removed data point, especially for advanced models such as random feature models and neural networks. Many previous works attempt to mitigate this distribution-independence by proposing weaker notions of stability; however, they either yield only polynomial bounds or produce bounds that do not vanish as the sample size goes to infinity. Given this, we propose \emph{locally elastic stability}, a weaker and distribution-dependent stability notion that still yields exponential generalization bounds. We further demonstrate that locally elastic stability implies tighter generalization bounds than those derived from uniform stability in many situations, by revisiting the examples of bounded support vector machines, regularized least squares regression, and stochastic gradient descent.
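As a rough sketch of the contrast between the two notions (following the standard leave-one-out formulation of stability; the notation below, including the data-dependent bound $\beta_m(\cdot,\cdot)$, is our paraphrase rather than a quote from the abstract): let $A_S$ denote the model trained on a dataset $S = \{z_1, \dots, z_m\}$, let $S^{\setminus i}$ be $S$ with the point $z_i$ removed, and let $\ell$ be the loss. Uniform stability with parameter $\beta_m$ requires the worst-case sensitivity to be small,
\[
\sup_{S,\, i,\, z}\ \big|\, \ell(A_S, z) - \ell(A_{S^{\setminus i}}, z) \,\big| \;\le\; \beta_m,
\]
whereas locally elastic stability replaces the single worst-case constant with a function of both the removed point and the evaluation point,
\[
\big|\, \ell(A_S, z) - \ell(A_{S^{\setminus i}}, z) \,\big| \;\le\; \beta_m(z_i, z) \quad \text{for all } S,\ i,\ z,
\]
so the bound can remain small for most pairs $(z_i, z)$ under the data distribution even when its supremum is large.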
Author Information
Zhun Deng (Harvard)
Hangfeng He (University of Pennsylvania)
Weijie Su (University of Pennsylvania)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Spotlight: Toward Better Generalization Bounds with Locally Elastic Stability »
  Thu. Jul 22nd, 02:35 -- 02:40 AM
More from the Same Authors
- 2021: On the Convergence of Deep Learning with Differential Privacy »
  Zhiqi Bu · Hua Wang · Qi Long · Weijie Su
- 2023: Reward Collapse in Aligning Large Language Models: A Prompt-Aware Approach to Preference Rankings »
  Ziang Song · Tianle Cai · Jason Lee · Weijie Su
- 2023 Poster: The Implicit Regularization of Dynamical Stability in Stochastic Gradient Descent »
  Lei Wu · Weijie Su
- 2022 Poster: ROCK: Causal Inference Principles for Reasoning about Commonsense Causality »
  Jiayao Zhang · Hongming ZHANG · Weijie Su · Dan Roth
- 2022 Poster: When and How Mixup Improves Calibration »
  Linjun Zhang · Zhun Deng · Kenji Kawaguchi · James Zou
- 2022 Spotlight: When and How Mixup Improves Calibration »
  Linjun Zhang · Zhun Deng · Kenji Kawaguchi · James Zou
- 2022 Poster: Robustness Implies Generalization via Data-Dependent Generalization Bounds »
  Kenji Kawaguchi · Zhun Deng · Kyle Luh · Jiaoyang Huang
- 2022 Oral: Robustness Implies Generalization via Data-Dependent Generalization Bounds »
  Kenji Kawaguchi · Zhun Deng · Kyle Luh · Jiaoyang Huang
- 2022 Spotlight: ROCK: Causal Inference Principles for Reasoning about Commonsense Causality »
  Jiayao Zhang · Hongming ZHANG · Weijie Su · Dan Roth
- 2021 Poster: Oneshot Differentially Private Top-k Selection »
  Gang Qiao · Weijie Su · Li Zhang
- 2021 Spotlight: Oneshot Differentially Private Top-k Selection »
  Gang Qiao · Weijie Su · Li Zhang
- 2020 Poster: Interpreting Robust Optimization via Adversarial Influence Functions »
  Zhun Deng · Cynthia Dwork · Jialiang Wang · Linjun Zhang
- 2020 Poster: Sharp Composition Bounds for Gaussian Differential Privacy via Edgeworth Expansion »
  Qinqing Zheng · Jinshuo Dong · Qi Long · Weijie Su
- 2020 Poster: Towards Understanding the Dynamics of the First-Order Adversaries »
  Zhun Deng · Hangfeng He · Jiaoyang Huang · Weijie Su