
Surprising benefits of ridge regularization for noiseless regression
Konstantin Donhauser · Alexandru Tifrea · Michael Aerni · Reinhard Heckel · Fanny Yang

Numerous recent works show that overparameterization implicitly reduces variance for minimum-norm interpolators, suggesting vanishing benefits for ridge regularization in high dimensions. However, empirical findings suggest that this narrative may not hold true for robust generalization. In this paper we reveal that for overparameterized linear regression, the robust risk is minimized for a positive regularization coefficient even when the training data is noiseless. Hence, we effectively provide, to the best of our knowledge, the first theoretical analysis of the phenomenon of robust overfitting.
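The central claim can be illustrated numerically. The sketch below is a minimal simulation under simplifying assumptions (isotropic Gaussian features, an ℓ2-bounded test-time adversary, and a closed-form robust-risk expression for that setting); the dimensions, attack budget `eps`, and regularization grid are hypothetical choices, not the paper's exact experimental setup.

```python
import numpy as np

# Hypothetical setup for the sketch: noiseless overparameterized linear
# regression (d > n) with isotropic Gaussian features and an l2-bounded
# test-time adversary of strength eps.
rng = np.random.default_rng(0)
n, d, eps = 50, 200, 0.4

beta = rng.standard_normal(d)
beta /= np.linalg.norm(beta)      # ground-truth signal, unit norm
X = rng.standard_normal((n, d))
y = X @ beta                      # noiseless labels

def ridge(lam):
    """Ridge estimator (X^T X + lam * I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def robust_risk(b_hat):
    """Closed-form l2 robust risk under Gaussian features:
    E[(|<x, b_hat - beta>| + eps * ||b_hat||)^2]."""
    sigma = np.linalg.norm(b_hat - beta)   # std of the clean prediction error
    c = eps * np.linalg.norm(b_hat)        # worst-case adversarial shift
    return sigma**2 + 2 * c * sigma * np.sqrt(2 / np.pi) + c**2

lams = np.logspace(-8, 3, 40)
risks = np.array([robust_risk(ridge(lam)) for lam in lams])

# The near-interpolating end of the grid (lam ~ 1e-8, i.e. essentially the
# min-norm interpolator) is not optimal: the robust risk is minimized at a
# strictly positive regularization level, despite the noiseless labels.
print(f"robust-risk-optimal lambda ~ {lams[np.argmin(risks)]:.3g}")
```

The intuition the simulation captures: shrinkage increases the clean error term but reduces the estimator's norm, and the adversarial penalty scales with that norm, so the two effects trade off at a positive regularization coefficient even without label noise.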

Author Information

Konstantin Donhauser (ETH Zürich)
Alexandru Tifrea (ETH Zürich)
Michael Aerni (ETH Zürich)
Reinhard Heckel (TU Munich)
Fanny Yang (ETH Zürich)
