

Poster

Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks

Blake Bordelon · Abdulkadir Canatar · Cengiz Pehlevan

Keywords: [ Deep Learning Theory ] [ Kernel Methods ] [ Supervised Learning ] [ General Machine Learning Techniques ]


Abstract:

We derive analytical expressions for the generalization performance of kernel regression as a function of the number of training samples, using theoretical methods from Gaussian processes and statistical physics. Our expressions apply to wide neural networks due to an equivalence between training them and kernel regression with the Neural Tangent Kernel (NTK). By decomposing the total generalization error into contributions from different spectral components of the kernel, we identify a new spectral principle: as the size of the training set grows, kernel machines and neural networks fit successively higher spectral modes of the target function. When data are sampled from a uniform distribution on a high-dimensional hypersphere, dot product kernels, including the NTK, exhibit learning stages in which different frequency modes of the target function are learned. We verify our theory with simulations on synthetic data and the MNIST dataset.
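For context, the spectral decomposition referred to above is the standard Mercer expansion of the kernel under the data distribution. The sketch below uses generic notation (the symbols λ_k, φ_k, a_k, E_k are ours, not necessarily the paper's) and states only the setup, not the paper's derived learning-curve expressions:

```latex
% Mercer eigendecomposition of the kernel under the data density p(x):
% eigenfunctions \phi_k and eigenvalues \lambda_k (sorted decreasing).
\begin{align}
  \int K(x, x')\, \phi_k(x')\, p(x')\, dx' &= \lambda_k\, \phi_k(x), &
  K(x, x') &= \sum_{k} \lambda_k\, \phi_k(x)\, \phi_k(x'), \\
  % Expand the target in the same basis and split the generalization
  % error E_g into per-mode contributions E_k.
  \bar{f}(x) &= \sum_{k} a_k\, \phi_k(x), &
  E_g &= \sum_{k} E_k .
\end{align}
```

In these terms, the spectral principle says that as the number of training samples grows, the mode errors E_k associated with the largest eigenvalues λ_k decay first; for dot product kernels on the hypersphere the eigenfunctions are spherical harmonics, so lower frequency modes of the target are learned before higher ones.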
