

Poster

Fast rates for noisy interpolation require rethinking the effect of inductive bias

Konstantin Donhauser · Nicolò Ruggeri · Stefan Stojanovic · Fanny Yang

Hall E #1109

Keywords: [ MISC: Supervised Learning ] [ T: Deep Learning ] [ T: Learning Theory ]


Abstract: Good generalization performance on high-dimensional data crucially hinges on the ground truth having a simple structure and on the estimator having a correspondingly strong inductive bias. Although this intuition holds for regularized models, in this paper we caution against a strong inductive bias for interpolation in the presence of noise: while a stronger inductive bias encourages a simpler structure that is better aligned with the ground truth, it also amplifies the detrimental effect of noise. Specifically, for both linear regression and classification with a sparse ground truth, we prove that minimum-$\ell_p$-norm and maximum-$\ell_p$-margin interpolators achieve fast polynomial rates close to order $1/n$ for $p > 1$, compared to a logarithmic rate for $p = 1$. Finally, we provide preliminary experimental evidence that this trade-off may also play a crucial role in understanding non-linear interpolating models used in practice.
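To make the central object concrete, below is a minimal sketch of the minimum-$\ell_p$-norm interpolator for noisy sparse linear regression: minimize $\|w\|_p$ subject to $Xw = y$. All specifics here are illustrative assumptions rather than the paper's experimental setup: the Gaussian design, the 1-sparse ground truth, the dimensions, the noise level, and the use of cvxpy as the solver.

```python
# Sketch (assumptions, not the paper's setup): compute the min-l_p-norm
# interpolator for a few values of p and report the estimation error.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, d, sigma = 50, 500, 0.5            # illustrative sample size, dimension, noise
w_star = np.zeros(d)
w_star[0] = 1.0                       # 1-sparse ground truth (assumption)
X = rng.standard_normal((n, d))       # Gaussian design (assumption)
y = X @ w_star + sigma * rng.standard_normal(n)

for p in (1.0, 1.1, 2.0):
    w = cp.Variable(d)
    # Minimum-l_p-norm interpolation: minimize ||w||_p subject to X w = y.
    prob = cp.Problem(cp.Minimize(cp.norm(w, p)), [X @ w == y])
    prob.solve()
    err = float(np.sum((w.value - w_star) ** 2))
    print(f"p = {p:<3}: estimation error ||w - w*||_2^2 = {err:.3f}")
```

A single draw is noisy, and the abstract's claim concerns rates in $n$: to see the polynomial-versus-logarithmic gap one would average over repetitions and vary $n$, not compare errors at one fixed sample size.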
