

Poster

On the Effectiveness of Supervision in Non-Contrastive Representation Learning

Jeongheon Oh · Kibok Lee


Abstract:

Supervised contrastive representation learning has been shown to be effective in many transfer learning scenarios. However, while non-contrastive learning often outperforms its contrastive counterpart in self-supervised representation learning, the extension of non-contrastive learning to supervised scenarios is less explored. To bridge this gap, we study non-contrastive learning for supervised representation learning, coined SupBYOL and SupSiam, which leverage labels in non-contrastive learning to achieve better representations. The proposed supervised non-contrastive learning framework improves representation learning while avoiding collapse. Our theoretical analysis reveals that providing supervision to non-contrastive learning reduces intra-class variance, and that the contribution of supervision should be adjusted to achieve the best performance. In experiments, we show the superiority of supervised non-contrastive learning across various datasets and tasks. The code will be released.
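To illustrate the idea described in the abstract, here is a minimal sketch of what a SupSiam-style objective could look like: a SimSiam-style negative-cosine loss augmented with a supervised term that pulls each sample toward same-class samples in the batch, with a mixing weight controlling the contribution of supervision. All names, the batch-level class-mean targets, and the weight `alpha` are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of a supervised non-contrastive (SupSiam-style) loss.
import torch
import torch.nn.functional as F

def neg_cosine(p, z):
    # SimSiam-style negative cosine similarity with stop-gradient on the target.
    return -F.cosine_similarity(p, z.detach(), dim=-1)

def supsiam_loss(p1, p2, z1, z2, labels, alpha=0.5):
    """p1, p2: predictor outputs for two views; z1, z2: projector outputs.
    `alpha` (an assumption) balances the self-supervised term against the
    supervised same-class term, reflecting the paper's finding that the
    contribution of supervision should be adjusted."""
    # Self-supervised symmetric term, as in SimSiam.
    self_loss = 0.5 * (neg_cosine(p1, z2) + neg_cosine(p2, z1)).mean()

    # Supervised term: pull each prediction toward the (detached) mean
    # target of same-class samples in the batch, reducing intra-class variance.
    same_class = (labels[:, None] == labels[None, :]).float()
    same_class = same_class / same_class.sum(dim=1, keepdim=True)
    class_target1 = same_class @ z2.detach()
    class_target2 = same_class @ z1.detach()
    sup_loss = 0.5 * (neg_cosine(p1, class_target1)
                      + neg_cosine(p2, class_target2)).mean()

    return (1 - alpha) * self_loss + alpha * sup_loss
```

The stop-gradient on the targets is what lets this family of methods avoid collapse without negative pairs; the sketch simply extends the set of positive targets from augmented views of the same image to class means within the batch.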
