Poster
Learning with Multiple Complementary Labels
Lei Feng · Takuo Kaneko · Bo Han · Gang Niu · Bo An · Masashi Sugiyama

Tue Jul 14 07:00 AM -- 07:45 AM & Tue Jul 14 07:00 PM -- 07:45 PM (PDT)

A complementary label (CL) simply indicates an incorrect class of an example, yet learning with CLs still yields multi-class classifiers that can predict the correct class. Unfortunately, the problem setting of previous research allows only a single CL per example, which notably limits its potential, since labelers can often easily identify multiple complementary labels (MCLs) for one example. In this paper, we propose a novel problem setting that allows MCLs for each example, together with two ways of learning with MCLs. In the first way, we design two wrappers that decompose MCLs into many single CLs in different manners, so that any existing method for learning with CLs can be used. However, we find that the supervision information held by MCLs is conceptually diluted after decomposition. Thus, in the second way, we derive an unbiased risk estimator; minimizing it processes each set of MCLs as a whole, and it possesses an estimation error bound. In addition, for practical implementation, we improve the second way by minimizing properly chosen upper bounds. Experiments on various benchmark datasets show that the first way works well for learning with MCLs, while the second is even better.
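To make the two strategies in the abstract concrete, here is a minimal sketch. It is not the paper's actual method: `decompose_mcl` illustrates the generic idea of the decomposition wrappers (one single-CL example per complementary label), and `whole_set_log_loss` is one natural surrogate for treating an MCL set as a whole, pushing probability mass onto the classes outside the set. Both function names and the exact loss form are assumptions for illustration.

```python
import math

def decompose_mcl(x, mcl_set):
    """Decompose one example carrying multiple complementary labels (MCLs)
    into several single-CL examples, one per complementary label.
    (Illustrative; the paper's wrappers define specific decomposition schemes.)"""
    return [(x, cl) for cl in sorted(mcl_set)]

def whole_set_log_loss(probs, mcl_set):
    """Illustrative whole-set surrogate loss: maximize the predicted
    probability mass on classes *outside* the complementary-label set."""
    allowed = sum(p for y, p in enumerate(probs) if y not in mcl_set)
    return -math.log(allowed)

# Example: 4 classes, classes 1 and 2 marked as complementary (incorrect).
pairs = decompose_mcl("x1", {1, 2})          # two single-CL training pairs
loss = whole_set_log_loss([0.25, 0.25, 0.25, 0.25], {1, 2})  # -log(0.5)
```

Note how the whole-set loss uses the entire MCL set in a single term, whereas decomposition spreads the same supervision across several weaker single-CL examples, which is the dilution the abstract refers to.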

Author Information

Lei Feng (Nanyang Technological University)
Takuo Kaneko (The University of Tokyo)
Bo Han (HKBU / RIKEN)
Gang Niu (RIKEN)
Bo An (Nanyang Technological University)
Masashi Sugiyama (RIKEN / The University of Tokyo)
