

Poster in Workshop: Over-parameterization: Pitfalls and Opportunities

Increasing Depth Leads to U-Shaped Test Risk in Over-parameterized Convolutional Networks

Eshaan Nichani · Adityanarayanan Radhakrishnan · Caroline Uhler

Keywords: [ Gaussian Processes and Bayesian non-parametrics ]


Abstract:

Recent works have demonstrated that increasing model capacity through width in over-parameterized neural networks leads to a decrease in test risk. Model capacity, however, can also be increased through depth, yet understanding the impact of increasing depth on test risk remains an open question. In this work, we demonstrate that the test risk of over-parameterized convolutional networks follows a U-shaped curve (i.e., first decreasing, then increasing) as depth increases. We first provide empirical evidence for this phenomenon via image classification experiments using both ResNets and the convolutional neural tangent kernel (CNTK). We then present a novel linear regression framework for characterizing the impact of depth on test risk, and show that increasing depth leads to a U-shaped test risk for the linear CNTK. In particular, we prove that the linear CNTK corresponds to a depth-dependent linear transformation on the original space and characterize properties of this transformation. We then analyze over-parameterized linear regression under arbitrary linear transformations and, in simplified settings, identify the depths which minimize the bias and variance terms of the test risk.
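To make the linear-regression framework concrete, here is a minimal sketch of the mechanics the abstract describes: fit the minimum-norm interpolating solution after applying a depth-dependent linear transformation to the features, then measure test risk as the transformation's "depth" varies. The transformation used below, M_L = A^L for a fixed random matrix A, is purely an illustrative assumption (the paper derives its transformation from the linear CNTK), as are the dimensions and noise level; the code only illustrates the over-parameterized regression setup, not the paper's results.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n, n_test, noise = 100, 40, 2000, 0.5   # d > n: over-parameterized regime
beta_star = rng.normal(size=d) / np.sqrt(d)

# Hypothetical stand-in for the depth-dependent transformation M_L:
# repeated application of a fixed random linear map A (illustrative only).
A = rng.normal(size=(d, d)) / np.sqrt(d)

def min_norm_test_risk(depth):
    M = np.linalg.matrix_power(A, depth)       # M_L = A^L (assumption, not the CNTK map)
    X = rng.normal(size=(n, d))
    y = X @ beta_star + noise * rng.normal(size=n)
    Phi = X @ M.T                              # transformed training features
    beta_hat = np.linalg.pinv(Phi) @ y         # minimum-norm interpolating solution
    X_test = rng.normal(size=(n_test, d))
    preds = (X_test @ M.T) @ beta_hat
    return np.mean((preds - X_test @ beta_star) ** 2)   # excess test risk

for L in range(1, 9):
    print(f"depth {L}: risk {min_norm_test_risk(L):.3f}")
```

Sweeping the depth parameter in this way is the basic experiment one would run to probe how the transformation's conditioning trades off the bias and variance terms of the test risk.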