Nonparametric Iterative Machine Teaching
Chen Zhang · Xiaofeng Cao · Weiyang Liu · Ivor Tsang · James Kwok

Thu Jul 27 01:30 PM -- 03:00 PM (PDT) @ Exhibit Hall 1 #318
Event URL: https://github.com/chen2hang/NonparametricTeaching

In this paper, we consider the problem of Iterative Machine Teaching (IMT), in which a teacher iteratively provides examples to a learner so that the learner converges quickly to a target model. Existing IMT algorithms, however, are based solely on parameterized families of target models: they focus on convergence in the parameter space and therefore struggle when the target model is a function with no parametric representation. To address this limitation, we study a more general task -- Nonparametric Iterative Machine Teaching (NIMT), which aims to teach nonparametric target models to learners in an iterative fashion. Unlike parametric IMT, which operates solely in the parameter space, we cast NIMT as a functional optimization problem in the function space. To solve it, we propose both random and greedy functional teaching algorithms. Under proper assumptions, we obtain the iterative teaching dimension (ITD) of the random teaching algorithm, which serves as a uniform upper bound on the ITD in NIMT. The greedy teaching algorithm attains a significantly lower ITD, yielding a tighter upper bound. Finally, we verify the correctness of our theoretical findings with extensive experiments in nonparametric scenarios.
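To make the setting concrete, the following is a minimal sketch of the greedy-teaching idea in the abstract: a nonparametric (kernel) learner updated by functional gradient descent, and a teacher that each round greedily picks the pool example where the learner's error against the target is largest. All names, the RBF kernel choice, the squared loss, and the error-based selection rule are illustrative assumptions for this sketch, not the paper's exact algorithm.

```python
import numpy as np

def rbf_kernel(x, y, gamma=20.0):
    # Gaussian (RBF) kernel; gamma is an illustrative bandwidth choice.
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

class KernelLearner:
    """Nonparametric learner: f(x) = sum_i a_i * K(x_i, x)."""
    def __init__(self, lr=0.5, gamma=20.0):
        self.lr, self.gamma = lr, gamma
        self.points, self.coefs = [], []

    def predict(self, x):
        return sum(a * rbf_kernel(p, x, self.gamma)
                   for p, a in zip(self.points, self.coefs))

    def update(self, x, y):
        # Functional gradient step on the squared loss:
        #   f <- f - lr * (f(x) - y) * K(x, .)
        residual = self.predict(x) - y
        self.points.append(x)
        self.coefs.append(-self.lr * residual)

def greedy_teach(target_f, pool, rounds=40, lr=0.5, gamma=20.0):
    """Greedy teacher: each round, feed the example where the learner's
    current error is largest (a simple proxy for the steepest
    functional-gradient direction)."""
    learner = KernelLearner(lr=lr, gamma=gamma)
    for _ in range(rounds):
        errs = [abs(learner.predict(x) - target_f(x)) for x in pool]
        x = pool[int(np.argmax(errs))]
        learner.update(x, target_f(x))
    return learner

# Example: teach a nonparametric target on [0, 1].
target = lambda x: np.sin(2 * np.pi * x)
pool = list(np.linspace(0.0, 1.0, 21))
learner = greedy_teach(target, pool)
max_err = max(abs(learner.predict(x) - target(x)) for x in pool)
```

Because the learner's hypothesis lives in a function space rather than a parameter vector, the teacher's choice is over which function-space direction (kernel atom) to add, which is the shift from parametric IMT that the paper formalizes.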

Author Information

Chen Zhang (Jilin University)
Xiaofeng Cao (Jilin University)
Weiyang Liu (University of Cambridge)
Ivor Tsang (University of Technology Sydney)
James Kwok (Hong Kong University of Science and Technology)