Poster

Searching to Exploit Memorization Effect in Learning with Noisy Labels

Quanming Yao · Hansi Yang · Bo Han · Gang Niu · James Kwok

Keywords: [ Transfer, Multitask and Meta-learning ] [ Semi-supervised learning ] [ Meta-learning and Automated ML ] [ Optimization ] [ Non-convex Optimization ]


Abstract:

Sample selection approaches are popular in robust learning from noisy labels. However, properly controlling the selection process so that deep networks can benefit from the memorization effect is a challenging problem. In this paper, motivated by the success of automated machine learning (AutoML), we model this issue as a function approximation problem. Specifically, we design a domain-specific search space based on general patterns of the memorization effect and propose a novel Newton algorithm to solve the resulting bi-level optimization problem efficiently. We further provide a theoretical analysis of the algorithm, which guarantees a good approximation to critical points. Experiments are performed on both benchmark and real-world data sets. Results demonstrate that the proposed method outperforms state-of-the-art noisy-label-learning approaches and is much more efficient than existing AutoML algorithms.
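To make the sample-selection idea concrete, the sketch below illustrates the small-loss criterion that underlies the memorization effect: networks fit clean patterns before noisy ones, so low-loss samples are more likely to be correctly labeled. The schedule `keep_ratio` and the parameters `noise_rate` and `warmup` are illustrative assumptions in the spirit of Co-teaching-style schedules, not the paper's searched function, which is instead found automatically within a domain-specific search space.

```python
import numpy as np

def keep_ratio(epoch, noise_rate=0.2, warmup=10):
    # Hypothetical hand-crafted schedule: keep all samples early
    # (clean patterns are memorized first), then linearly drop the
    # presumed-noisy fraction `noise_rate` over `warmup` epochs.
    # The paper searches over a family of such schedules instead.
    return 1.0 - noise_rate * min(epoch / warmup, 1.0)

def select_small_loss(losses, ratio):
    # Keep the `ratio` fraction of samples with the smallest
    # per-sample loss, treating them as likely-clean for training.
    n_keep = max(1, int(round(ratio * len(losses))))
    return np.argsort(losses)[:n_keep]

# Example: at epoch 5 with four samples, keep the smallest-loss 90%.
losses = np.array([0.1, 2.0, 0.3, 5.0])
clean_idx = select_small_loss(losses, keep_ratio(5))
```

Searching for the schedule as a function, rather than fixing it by hand, is what turns this heuristic into the bi-level optimization problem the paper's Newton algorithm solves.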
