

Poster in Workshop: Topology, Algebra, and Geometry in Machine Learning

Invariance-adapted decomposition and Lasso-type contrastive learning

Masanori Koyama · Takeru Miyato · Kenji Fukumizu


Abstract:

Recent years have witnessed the effectiveness of contrastive learning in obtaining representations of datasets that are useful for interpretation and downstream tasks. However, the mechanism behind this effectiveness has not been thoroughly analyzed, and many studies have been conducted to investigate the data structures captured by contrastive learning. In particular, the recent study of von Kügelgen et al. (2021) has shown that contrastive learning is capable of decomposing the data space into the space that is invariant to all augmentations and its complement. In this paper, we introduce the notion of an invariance-adapted latent space, which decomposes the data space into the intersections of the invariant spaces of each augmentation and their complements. This decomposition generalizes the one introduced in von Kügelgen et al. (2021) and describes a structure analogous to the frequencies in the harmonic analysis of a group. We experimentally show that contrastive learning with a lasso-type metric can be used to find an invariance-adapted latent space, thereby suggesting a new potential of contrastive learning. We also investigate when such a latent space can be identified up to mixings within each component.
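The abstract does not specify the exact form of the lasso-type objective. The following is a minimal illustrative sketch, assuming an InfoNCE-style contrastive loss in which the usual similarity is replaced by a negative L1 (lasso-type) distance between embeddings of two augmented views; the function name and temperature parameter are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F

def lasso_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Illustrative InfoNCE-style loss with an L1 (lasso-type) metric.

    z1, z2: (batch, dim) embeddings of two augmented views of the same batch,
    where row i of z1 and row i of z2 form a positive pair.
    """
    # Pairwise L1 distances between the two views: shape (batch, batch).
    dist = torch.cdist(z1, z2, p=1)
    # Smaller distance means higher similarity; scale by the temperature.
    logits = -dist / temperature
    # Positive pairs lie on the diagonal of the logits matrix.
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)
```

The L1 metric penalizes coordinate-wise differences between views, which intuitively encourages differences induced by an augmentation to concentrate on a sparse set of latent coordinates, in line with the invariance-adapted decomposition described above.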
