

Poster

InterLUDE: Interactions between Labeled and Unlabeled Data to Enhance Semi-Supervised Learning

Zhe Huang · Xiaowei Yu · Dajiang Zhu · Michael Hughes

Hall C 4-9 #2211
Tue 23 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

Semi-supervised learning (SSL) seeks to enhance task performance by training on both labeled and unlabeled data. Mainstream SSL image classification methods mostly optimize a loss that additively combines a supervised classification objective with a regularization term derived solely from unlabeled data. This formulation often neglects the potential for interaction between labeled and unlabeled images. In this paper, we introduce InterLUDE, a new approach to enhance SSL comprising two parts, each of which benefits from labeled-unlabeled interaction. The first part, embedding fusion, interpolates between labeled and unlabeled embeddings to improve representation learning. The second part is a new loss, grounded in the principle of consistency regularization, that aims to minimize discrepancies in the model's predictions between labeled and unlabeled inputs. Experiments on standard closed-set SSL benchmarks and a medical SSL task with an uncurated unlabeled set show clear benefits from our approach. On the STL-10 dataset with only 40 labels, InterLUDE achieves a 3.2% error rate, while the best previous method reports 6.3%.
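To make the two components concrete, here is a minimal PyTorch sketch. The function names, the equal-sized labeled/unlabeled batches, the Beta-sampled mixing coefficient, and the MSE between mean class distributions are all illustrative assumptions; the abstract does not specify InterLUDE's exact fusion rule or loss form, so consult the paper for the actual formulation.

```python
import torch
import torch.nn.functional as F

def embedding_fusion(z_lab, z_unlab, alpha=0.5):
    """Interpolate labeled and unlabeled embeddings (mixup-style).

    Hypothetical sketch: assumes equal-sized batches and a
    Beta(alpha, alpha)-sampled mixing coefficient; the paper's
    exact fusion rule may differ.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().to(z_lab.device)
    return lam * z_lab + (1.0 - lam) * z_unlab

def cross_consistency_loss(logits_lab, logits_unlab):
    """Penalize discrepancy between predictions on labeled and
    unlabeled inputs, here via the MSE between the two batches'
    mean class distributions (an illustrative choice, not
    necessarily InterLUDE's loss).
    """
    p_lab = F.softmax(logits_lab, dim=-1).mean(dim=0)
    p_unlab = F.softmax(logits_unlab, dim=-1).mean(dim=0)
    return F.mse_loss(p_lab, p_unlab)

# Usage with dummy tensors (batch of 32, embedding dim 128, 10 classes):
z_l, z_u = torch.randn(32, 128), torch.randn(32, 128)
fused = embedding_fusion(z_l, z_u)                       # shape (32, 128)
loss = cross_consistency_loss(torch.randn(32, 10),
                              torch.randn(32, 10))       # scalar
```

Both pieces would typically be added on top of a standard SSL objective (supervised cross-entropy plus an unlabeled regularizer), which is the additive formulation the abstract contrasts against.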
