Improved Generalization Bounds for Transfer Learning via Neural Collapse
Tomer Galanti · Andras Gyorgy · Marcus Hutter
Event URL: https://openreview.net/forum?id=VrK7pKwOhT_

Using representations learned by large, pretrained models, also known as foundation models, on new tasks with limited data has proven successful in a wide range of machine learning problems. Recently, Galanti et al. (2022) introduced a theoretical framework for studying this transfer-learning setting for classification. Their analysis builds on the recently observed phenomenon that the features learned by overparameterized deep classification networks exhibit an interesting clustering property, called neural collapse (Papyan et al., 2020). A cornerstone of their analysis is the demonstration that neural collapse generalizes from the source classes to new target classes. However, their analysis is limited in that it relies on several unrealistic modeling assumptions. In this work, we provide an improved theoretical analysis that significantly relaxes these assumptions.
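For readers unfamiliar with how neural collapse is quantified in this line of work, the sketch below computes the class-distance normalized variance (CDNV) used by Galanti et al. (2022) for a pair of classes: the average within-class feature variance normalized by the squared distance between the class means, with values near zero indicating strong collapse. The array names and the `embed` feature map in the usage comment are hypothetical, and the snippet assumes features are provided as NumPy arrays of shape (num_samples, feature_dim).

```python
import numpy as np

def cdnv(features_a: np.ndarray, features_b: np.ndarray) -> float:
    """Class-distance normalized variance (CDNV) between two classes.

    Measures neural collapse for a pair of classes as the ratio of the
    average within-class feature variance to the squared distance between
    the class means; values close to zero indicate tight, well-separated
    feature clusters (strong collapse).
    """
    mu_a, mu_b = features_a.mean(axis=0), features_b.mean(axis=0)
    var_a = np.mean(np.sum((features_a - mu_a) ** 2, axis=1))
    var_b = np.mean(np.sum((features_b - mu_b) ** 2, axis=1))
    return (var_a + var_b) / (2.0 * np.sum((mu_a - mu_b) ** 2))

# Hypothetical usage: `embed` denotes the penultimate-layer feature map.
# feats_a = embed(samples_of_class_a); feats_b = embed(samples_of_class_b)
# print(cdnv(feats_a, feats_b))  # small value => neural collapse between the classes
```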

Author Information

Tomer Galanti (Massachusetts Institute of Technology)
Andras Gyorgy (DeepMind)
Marcus Hutter (DeepMind)
