

Poster in Workshop: AI for Science

Predicting generalization with degrees of freedom in neural networks

Erin Grant · Yan Wu


Abstract:

Model complexity is fundamentally tied to predictive power in the sciences as well as in applications. However, there is a divergence between naive measures of complexity, such as parameter count, and the generalization performance of overparameterized machine learning models. Prior empirical approaches to capturing intrinsic complexity independent of parameter count are computationally intractable, do not capture the implicitly regularizing effects of the entire machine learning pipeline, or do not provide a quantitative fit to the double-descent behavior of overparameterized models. In this work, we introduce an empirical complexity measure inspired by the classical notion of degrees of freedom in statistics. This measure can be approximated efficiently and is a function of the entire model training pipeline. We demonstrate that this measure strongly correlates with generalization performance in the double-descent regime.
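For background only, here is a minimal sketch of the classical statistical notion the abstract references (the covariance definition of effective degrees of freedom, commonly attributed to Efron); this is standard textbook material, not the paper's specific estimator. For a fitted predictor \hat{f} trained on data (x_i, y_i)_{i=1}^{n} with observation noise variance \sigma^2,

\[
\operatorname{df}(\hat{f}) \;=\; \frac{1}{\sigma^{2}} \sum_{i=1}^{n} \operatorname{Cov}\!\bigl(\hat{f}(x_i),\, y_i\bigr),
\]

which reduces to \operatorname{tr}(S) for a linear smoother \hat{y} = S y, and to the parameter count p for ordinary least squares with p < n features, recovering the naive complexity measure mentioned above.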
