Surprising benefits of ridge regularization for noiseless regression
Konstantin Donhauser · Alexandru Tifrea · Michael Aerni · Reinhard Heckel · Fanny Yang
Numerous recent works show that overparameterization implicitly reduces variance for minimum-norm interpolators, suggesting vanishing benefits for ridge regularization in high dimensions. However, empirical findings suggest that this narrative may not hold for robust generalization. In this paper we show that for overparameterized linear regression, the robust risk is minimized at a positive regularization coefficient even when the training data is noiseless. Hence, to the best of our knowledge, we provide the first theoretical analysis of the phenomenon of robust overfitting.
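To make the setting concrete, here is a minimal illustrative sketch (not the paper's exact data model or risk definition; dimensions, the Gaussian design, and variable names are assumptions for illustration): noiseless overparameterized linear regression, where the ridge estimator at vanishing regularization recovers the minimum-norm interpolator, while a positive regularization coefficient trades a little training error for a smaller parameter norm — the quantity that drives the perturbation term in robust risks.

```python
import numpy as np

# Noiseless overparameterized linear regression: y = X @ beta_star, n < d.
rng = np.random.default_rng(0)
n, d = 50, 200                        # n samples, d features (overparameterized)
X = rng.standard_normal((n, d))
beta_star = rng.standard_normal(d) / np.sqrt(d)
y = X @ beta_star                     # noiseless targets

def ridge(X, y, lam):
    """Ridge estimator in kernel form, X.T (X X.T + lam I)^{-1} y; valid for n < d."""
    n = X.shape[0]
    return X.T @ np.linalg.solve(X @ X.T + lam * np.eye(n), y)

# lam -> 0 recovers the minimum-norm interpolator: the training error vanishes.
beta_0 = ridge(X, y, 1e-12)
print(np.linalg.norm(X @ beta_0 - y))  # ~0: interpolates the training data

# Positive lam shrinks the parameter norm, which reduces the
# epsilon * ||beta||-type penalty that appears in robust risks.
beta_pos = ridge(X, y, 10.0)
print(np.linalg.norm(beta_pos) < np.linalg.norm(beta_0))  # True
```

The paper's claim is about the robust risk itself: sweeping the regularization coefficient and evaluating robust risk would exhibit a minimum at a strictly positive value, even though the data above is noiseless.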
Author Information
Konstantin Donhauser (ETH Zürich)
Alexandru Tifrea (ETH Zurich)
Michael Aerni (ETH Zürich)
Reinhard Heckel (TU Munich)
Fanny Yang (ETH)
More from the Same Authors
- 2021 : Maximizing the robust margin provably overfits on noiseless data »
  Fanny Yang · Reinhard Heckel · Michael Aerni · Alexandru Tifrea · Konstantin Donhauser
- 2021 : Novel disease detection using ensembles with regularized disagreement »
  Alexandru Tifrea · Eric Stavarache · Fanny Yang
- 2022 : Why adversarial training can hurt robust accuracy »
  Jacob Clarysse · Julia Hörrmann · Fanny Yang
- 2022 : Provable Concept Learning for Interpretable Predictions Using Variational Autoencoders »
  Armeen Taeb · Nicolò Ruggeri · Carina Schnuck · Fanny Yang
- 2022 : Monotonic Risk Relationships under Distribution Shifts for Regularized Risk Minimization »
  Daniel LeJeune · Jiayu Liu · Reinhard Heckel
- 2022 Poster: Fast rates for noisy interpolation require rethinking the effect of inductive bias »
  Konstantin Donhauser · Nicolò Ruggeri · Stefan Stojanovic · Fanny Yang
- 2022 Spotlight: Fast rates for noisy interpolation require rethinking the effect of inductive bias »
  Konstantin Donhauser · Nicolò Ruggeri · Stefan Stojanovic · Fanny Yang
- 2021 Poster: How rotational invariance of common kernels prevents generalization in high dimensions »
  Konstantin Donhauser · Mingqi Wu · Fanny Yang
- 2021 Spotlight: How rotational invariance of common kernels prevents generalization in high dimensions »
  Konstantin Donhauser · Mingqi Wu · Fanny Yang
- 2020 : QA for invited talk 3 Yang »
  Fanny Yang
- 2020 : Invited talk 3 Yang »
  Fanny Yang
- 2020 Poster: Understanding and Mitigating the Tradeoff between Robustness and Accuracy »
  Aditi Raghunathan · Sang Michael Xie · Fanny Yang · John Duchi · Percy Liang