
A Theory of Label Propagation for Subpopulation Shift
Tianle Cai · Ruiqi Gao · Jason Lee · Qi Lei

Wed Jul 21 09:00 PM -- 11:00 PM (PDT) @ Virtual

One of the central problems in machine learning is domain adaptation. Unlike past theoretical work, we consider a new model of subpopulation shift in the input or representation space. We propose a provably effective framework for domain adaptation based on label propagation that uses an input consistency loss. Our analysis relies on a simple but realistic "expansion" assumption, proposed in Wei et al. (2021). We show that, starting from a teacher classifier trained on the source domain, the learned classifier not only propagates to the target domain but also improves upon the teacher. By leveraging existing generalization bounds, we also obtain end-to-end finite-sample guarantees for deep neural networks. In addition, we extend our theoretical framework to a more general setting of source-to-target transfer based on an additional unlabeled dataset, which applies readily to various learning scenarios. Inspired by our theory, we adapt consistency-based semi-supervised learning methods to domain adaptation settings and obtain significant improvements.
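To make the "input consistency loss" mentioned above concrete, here is a minimal NumPy sketch of one common form of such a penalty: the model's predictive distribution on an input is compared against its prediction on a slightly perturbed copy of that input. The squared-difference form, the function names, and the Gaussian perturbation are illustrative assumptions for exposition, not the paper's exact objective.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def input_consistency_loss(logits_clean, logits_perturbed):
    """Penalize changes in the predicted class distribution when the
    input is slightly perturbed. Here: mean squared difference between
    the two predictive distributions (an illustrative choice)."""
    p = softmax(logits_clean)
    q = softmax(logits_perturbed)
    return float(np.mean((p - q) ** 2))

# Toy usage with random "model outputs" (4 examples, 3 classes).
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 3))
noisy_logits = logits + 0.1 * rng.normal(size=logits.shape)

# Identical predictions incur zero penalty; perturbed ones a small positive one.
zero_loss = input_consistency_loss(logits, logits)
small_loss = input_consistency_loss(logits, noisy_logits)
```

In a consistency-based training loop, a term like this is added to the supervised loss on unlabeled (here, target-domain) inputs, encouraging the classifier to be locally constant; the expansion assumption is what lets such local constancy propagate labels across the subpopulation shift.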

Author Information

Tianle Cai (Princeton University)
Ruiqi Gao (Princeton University)
Jason Lee (Princeton)
Qi Lei (Princeton University)
