
Poster
in
Workshop: Over-parameterization: Pitfalls and Opportunities

Empirical Study on the Effective VC Dimension of Low-rank Neural Networks

Daewon Seo · Hongyi Wang · Dimitris Papailiopoulos · Kangwook Lee


Abstract:

A recent study by Huh et al. (2021) finds that gradient-based training of overparameterized neural networks converges to low-rank parameters, suggesting that implicit low-rank constraints are at work. Motivated by this, we empirically study the effective VC dimension of neural networks with low-rank parameters. We find that their effective VC dimension is proportional to a specific weighted sum of per-layer parameter counts, which we call the effective number of parameters. Since the effective VC dimension lower-bounds the VC dimension, our result suggests that the analytic VC dimension upper bound of Bartlett et al. (2019) may indeed be tight for neural networks with low-rank parameters.
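To make the notion of a "weighted sum of per-layer parameter counts" concrete, the sketch below counts parameters of rank-r factorized layers and combines them with a depth-indexed weighting. The per-layer count r(m+n) for a rank-r factorization of an m×n matrix is standard; the specific depth weighting here is a hypothetical stand-in (loosely modeled on the depth-weighted sums appearing in bounds like Bartlett et al., 2019), not the paper's exact definition.

```python
def low_rank_params(m: int, n: int, r: int) -> int:
    """Parameters in a rank-r factorization W = U @ V of an m x n matrix."""
    full = m * n
    factored = r * (m + n)  # U is m x r, V is r x n
    # Factorizing only saves parameters when r < mn / (m + n).
    return min(full, factored)

def effective_num_params(layer_shapes, ranks) -> int:
    """Depth-weighted sum of per-layer parameter counts.

    The weight (the 1-based layer index) is a hypothetical choice for
    illustration; the paper's weighting may differ.
    """
    return sum(
        depth * low_rank_params(m, n, r)
        for depth, ((m, n), r) in enumerate(zip(layer_shapes, ranks), start=1)
    )

# Example: a 3-layer network with rank-4 weight matrices.
layer_shapes = [(784, 256), (256, 256), (256, 10)]
ranks = [4, 4, 4]
print(effective_num_params(layer_shapes, ranks))  # 1*4160 + 2*2048 + 3*1064
```

Note how the rank-4 factorization shrinks the first layer from 200,704 parameters to 4,160, which is the kind of reduction that makes the low-rank VC-dimension question interesting.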