
Workshop: Over-parameterization: Pitfalls and Opportunities

Generalization Error and Overparameterization While Learning over Networks

Martin Hellkvist · Ayca Ozcelikkale

Keywords: [ Stochastic Optimization ]


We investigate the performance of distributed learning for large-scale linear regression where the model unknowns are distributed over the network. We provide high-probability bounds on the generalization error of this distributed learning setting for both isotropic and correlated Gaussian regressors. Our analysis shows that the generalization error of the distributed solution can grow unboundedly even when the training error is low. We also highlight the effect of partitioning the training data over the network of learners on the generalization error. Our results are particularly interesting in the overparameterized scenario, which exhibits fast convergence but also possibly unbounded generalization error.
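As a minimal illustration of the phenomenon described above, the sketch below uses a centralized minimum-norm least-squares solution with isotropic Gaussian regressors (not the paper's distributed algorithm; the problem dimensions and noiseless data model are illustrative assumptions): when the number of unknowns exceeds the number of training samples, the training error is essentially zero while the error on fresh data from the same distribution remains large.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparameterized regime: more unknowns (p) than training samples (n).
# Dimensions are illustrative, not taken from the paper.
n, p = 50, 200
A = rng.standard_normal((n, p))   # isotropic Gaussian regressors
x_true = rng.standard_normal(p)   # ground-truth unknowns
y = A @ x_true                    # noiseless observations

# Minimum-norm least-squares solution: interpolates the training data
# exactly whenever p > n and A has full row rank.
x_hat = np.linalg.pinv(A) @ y

train_err = np.mean((A @ x_hat - y) ** 2)

# Generalization error, estimated on fresh regressors from the
# same isotropic Gaussian distribution.
A_test = rng.standard_normal((1000, p))
test_err = np.mean((A_test @ x_hat - A_test @ x_true) ** 2)

print(f"training error: {train_err:.2e}")
print(f"test error:     {test_err:.2e}")
```

Because the minimum-norm solution only recovers the projection of the true unknowns onto the row space of the training regressors, the training error is at machine precision while the test error stays on the order of the unrecovered component, mirroring the low-training-error / high-generalization-error gap discussed in the abstract.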