Workshop: Over-parameterization: Pitfalls and Opportunities

Overfitting of Polynomial Regression with Overparameterization

Hugo Fabregues · Berfin Simsek

Abstract: We apply the theory introduced in \cite{jacot2020kernel} to study the risk (i.e., the generalization error) in a polynomial regression setup where the data points $ x_i $ are i.i.d. samples from the uniform distribution on $[-1,1]$ and the observations $ y_i = f^*(x_i) + \epsilon e_i $ are generated by a continuous true function $ f^* $, with standard Gaussian noise $ e_i $ and noise level $ \epsilon $. In this setup, we compute the Signal Capture Threshold (SCT) exactly as a function of the number of polynomial features $ P+1 $, the number of samples $ N $, and the ridge parameter $ \lambda > 0 $. This enables a precise analysis of the risk and explains the overfitting of polynomial features under overparameterization.
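The setup described above can be simulated directly. The sketch below is an illustration, not the paper's experiment: the target $ f^* $ (here a cosine), the noise level, the monomial feature map, and the $N$-scaled ridge penalty are all assumptions made for the example; the abstract only fixes the data distribution, the noise model, and the presence of a ridge $\lambda > 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_star(x):
    # Hypothetical continuous target; the paper does not specify f* here.
    return np.cos(2 * np.pi * x)

def poly_features(x, P):
    # Monomial features 1, x, ..., x^P: P+1 features, as in the abstract.
    return np.vander(x, P + 1, increasing=True)

def ridge_fit(x, y, P, lam):
    # Ridge estimator with an N-scaled penalty (a normalization assumption).
    Phi = poly_features(x, P)
    N = len(x)
    A = Phi.T @ Phi / N + lam * np.eye(P + 1)
    return np.linalg.solve(A, Phi.T @ y / N)

def risk(beta, P, n_test=2000):
    # Monte Carlo estimate of the excess risk against the noiseless f*.
    x_test = rng.uniform(-1.0, 1.0, n_test)
    pred = poly_features(x_test, P) @ beta
    return float(np.mean((pred - f_star(x_test)) ** 2))

# Data generated exactly as in the abstract: x_i ~ Unif[-1, 1],
# y_i = f*(x_i) + eps * e_i with standard Gaussian e_i.
N, eps, lam = 30, 0.3, 1e-3
x = rng.uniform(-1.0, 1.0, N)
y = f_star(x) + eps * rng.standard_normal(N)

for P in (2, 10, 40):
    beta = ridge_fit(x, y, P, lam)
    print(f"P = {P:3d}  risk = {risk(beta, P):.4f}")
```

Sweeping $P$ past $N$ in this script shows how the risk degrades as the number of features grows, which is the regime the SCT analysis is designed to explain.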