Learning with Bounded Instance- and Label-dependent Label Noise

Jiacheng Cheng · Tongliang Liu · Kotagiri Ramamohanarao · Dacheng Tao

Keywords: [ Semi-supervised Learning ] [ Unsupervised and Semi-Supervised Learning ]


Abstract: Instance- and Label-dependent label Noise (ILN) widely exists in real-world datasets but has rarely been studied. In this paper, we focus on Bounded Instance- and Label-dependent label Noise (BILN), a particular case of ILN where the label noise rates---the probabilities that the true labels of examples flip into the corrupted ones---have an upper bound less than $1$. Specifically, we introduce the concept of distilled examples, i.e., examples whose labels are identical to the labels assigned to them by the Bayes optimal classifier, and prove that under certain conditions classifiers learnt on distilled examples will converge to the Bayes optimal classifier. Inspired by the idea of learning with distilled examples, we then propose a learning algorithm with theoretical guarantees for its robustness to BILN. Finally, empirical evaluations on both synthetic and real-world datasets show the effectiveness of our algorithm in learning with BILN.
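To make the setting concrete, the following is a minimal toy sketch (not the paper's algorithm) of bounded instance-dependent noise and of what a distilled example is. The data model, the `noise_rate` function, and all names here are illustrative assumptions: a 1-D two-class Gaussian mixture whose Bayes optimal classifier is a threshold at zero, with a flip probability that varies with the instance but stays bounded away from $1$.

```python
import random

random.seed(0)

# Toy 1-D setup: class 0 ~ N(-1, 1), class 1 ~ N(+1, 1), equal priors.
# For this mixture the Bayes optimal classifier is the threshold at x = 0.
def bayes_label(x):
    return 1 if x > 0 else 0

# Bounded instance-dependent flip probability: it varies with x but is
# capped strictly below 1 (here below 0.5). This particular rate function
# is a hypothetical illustration, not the paper's noise model.
def noise_rate(x):
    return min(0.4, 0.3 / (1.0 + abs(x)))

data = []
for _ in range(10000):
    y_true = random.randint(0, 1)
    x = random.gauss(-1.0 if y_true == 0 else 1.0, 1.0)
    # Corrupt the label with instance-dependent probability.
    y_observed = 1 - y_true if random.random() < noise_rate(x) else y_true
    data.append((x, y_observed))

# "Distilled" examples: those whose observed label agrees with the label
# the Bayes optimal classifier would assign.
distilled = [(x, y) for x, y in data if y == bayes_label(x)]
frac = len(distilled) / len(data)
```

In practice the Bayes optimal classifier is of course unknown, so distilled examples cannot be picked out directly as above; the point of the paper is to identify (a subset of) them from noisy data alone and train on those.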
