

Poster

Rethinking Guidance Information to Utilize Unlabeled Samples: A Label Encoding Perspective

Yulong Zhang · Yuan Yao · Shuhao Chen · Pengrong Jin · Yu Zhang · Jian Jin · Jiangang Lu


Abstract:

Empirical Risk Minimization (ERM) has achieved great success in scenarios with sufficient labeled samples. However, many practical scenarios suffer from a shortage of labeled samples, and under such conditions ERM performs poorly because it cannot unleash the potential of unlabeled samples. In this paper, we rethink the guidance information used to exploit unlabeled samples in these scenarios. By analyzing the learning objective of ERM, we find that the guidance information for the labeled samples in a specific category is the corresponding label encoding. Inspired by this finding, we propose Label-Encoding Risk Minimization (LERM) to mine the potential of unlabeled samples. It first estimates the label encodings through prediction means of unlabeled samples and then aligns them with their corresponding ground-truth label encodings. As a result, LERM ensures both prediction discriminability and diversity, and it can be integrated into existing methods as a plugin. Theoretically, we analyze the relationship between LERM and ERM. Empirically, we verify the superiority of LERM under several label-insufficient scenarios, including semi-supervised learning, unsupervised domain adaptation, and semi-supervised heterogeneous domain adaptation.
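The abstract's core idea can be sketched concretely. The following is a minimal NumPy illustration, not the authors' implementation: it assumes one-hot ground-truth label encodings, estimates each class's label encoding as a probability-weighted mean of the model's softmax predictions on unlabeled samples, and aligns the two with a cross-entropy term. The exact weighting scheme and alignment loss used in the paper may differ.

```python
import numpy as np

def label_encoding_risk(probs, eps=1e-8):
    """Sketch of a label-encoding risk on unlabeled samples.

    probs: (n, K) array of softmax predictions for n unlabeled samples
           over K classes.

    For each class k, the label encoding is estimated as the mean of the
    predictions weighted by the predicted probability of class k, and is
    then aligned with the one-hot encoding e_k via cross-entropy
    (both the weighting and the loss are assumptions based on the abstract).
    """
    n, K = probs.shape
    risk = 0.0
    for k in range(K):
        w = probs[:, k]                                   # soft assignment to class k
        mean_k = (w[:, None] * probs).sum(axis=0) / (w.sum() + eps)
        risk += -np.log(mean_k[k] + eps)                  # cross-entropy with one-hot e_k
    return risk / K
```

Intuitively, this term is small when predictions are both confident (discriminable) and spread across classes (diverse): confident, class-balanced predictions drive each estimated encoding toward its one-hot target, whereas near-uniform predictions leave every estimated encoding far from its target.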
