Learning with Bounded Instance- and Label-dependent Label Noise
Jiacheng Cheng · Tongliang Liu · Kotagiri Ramamohanarao · Dacheng Tao

Thu Jul 16 06:00 AM -- 06:45 AM & Thu Jul 16 05:00 PM -- 05:45 PM (PDT)
Instance- and Label-dependent label Noise (ILN) widely exists in real-world datasets but has rarely been studied. In this paper, we focus on Bounded Instance- and Label-dependent label Noise (BILN), a particular case of ILN where the label noise rates---the probabilities that the true labels of examples flip into the corrupted ones---have an upper bound less than $1$. Specifically, we introduce the concept of distilled examples, i.e., examples whose labels are identical to the labels assigned to them by the Bayes optimal classifier, and prove that under certain conditions classifiers learnt on distilled examples will converge to the Bayes optimal classifier. Inspired by the idea of learning with distilled examples, we then propose a learning algorithm with theoretical guarantees for its robustness to BILN. Finally, empirical evaluations on both synthetic and real-world datasets show the effectiveness of our algorithm in learning with BILN.
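The two concepts in the abstract---a noise rate bounded strictly below $1$, and distilled examples whose labels agree with the Bayes optimal classifier---can be illustrated on synthetic data. The sketch below is not the paper's algorithm: it uses oracle knowledge of the Bayes rule to mark distilled examples, whereas the paper's contribution is a method for collecting such examples without that oracle; the noise function `rho` is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D binary task: P(Y = 1 | x) = sigmoid(3x),
# so the Bayes optimal classifier predicts 1 iff x > 0.
n = 1000
x = rng.uniform(-1, 1, size=n)
p_pos = 1.0 / (1.0 + np.exp(-3.0 * x))
y = (rng.random(n) < p_pos).astype(int)   # clean labels
bayes = (x > 0).astype(int)               # Bayes optimal predictions

# Bounded instance- and label-dependent noise: the flip rate
# rho(x, y) varies with both the instance x and its label y,
# but is capped by an upper bound strictly less than 1.
# (This particular rho is an illustrative assumption.)
upper_bound = 0.4
rho = upper_bound * np.abs(np.sin(5 * x)) * np.where(y == 1, 1.0, 0.8)
y_noisy = np.where(rng.random(n) < rho, 1 - y, y)  # corrupted labels

# Distilled examples: observed labels that agree with the Bayes
# optimal classifier. Here the Bayes rule is known by construction;
# the paper's method must identify such examples without it.
distilled = y_noisy == bayes
print(f"max noise rate: {rho.max():.2f} (bound {upper_bound})")
print(f"fraction distilled: {distilled.mean():.2f}")
```

Because the noise rate never reaches $1$, the observed labels retain information about the Bayes optimal labels, which is what makes the distilled subset large enough to learn from.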

Author Information

Jiacheng Cheng (University of California, San Diego)
Tongliang Liu (The University of Sydney)
Kotagiri Ramamohanarao (The University of Melbourne)
Dacheng Tao (The University of Sydney)
