

Margin-aware Adversarial Domain Adaptation with Optimal Transport

Sofien Dhouib · Ievgen Redko · Carole Lartizien

Keywords: [ Transfer and Multitask Learning ] [ Transfer, Multitask and Meta-learning ]


In this paper, we propose a new theoretical analysis of unsupervised domain adaptation that relates notions of large margin separation, adversarial learning and optimal transport. This analysis generalizes previous work on the subject by bounding the target margin violation rate, thus reflecting a better control of the quality of separation between classes in the target domain than a bound on the misclassification rate. The bound also highlights the benefit of a large margin separation on the source domain for adaptation and introduces an optimal transport (OT) based distance between domains that has the virtue of being task-dependent, unlike other approaches. From the obtained theoretical results, we derive a novel algorithmic solution for domain adaptation that introduces a shallow OT-based adversarial approach and outperforms other OT-based DA baselines on several simulated and real-world classification tasks.
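The task-dependent OT distance described above can be illustrated with a minimal sketch: instead of comparing source and target samples by a purely geometric cost, the cost matrix is built from the disagreement of a classifier's margins across domains, and an OT plan is computed over it. The code below is an illustrative toy, not the authors' algorithm; the linear scorer `w`, the absolute-margin cost, and the entropy-regularized (Sinkhorn) solver are all assumptions made for the example.

```python
import numpy as np

def sinkhorn_ot(cost, reg=0.1, n_iter=200):
    """Entropy-regularized OT (Sinkhorn iterations) with uniform marginals."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / reg)          # Gibbs kernel of the cost matrix
    u = np.ones(n)
    for _ in range(n_iter):          # alternating marginal projections
        v = b / (K.T @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]
    return float((plan * cost).sum()), plan

rng = rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(30, 2))   # source samples
Xt = rng.normal(0.5, 1.0, size=(40, 2))   # shifted target samples
w = np.array([1.0, -1.0])                 # hypothetical linear scorer

# Task-dependent cost: how much the scorer's margin on a source point
# disagrees with its margin on a target point (not a geometric distance).
cost = np.abs((Xs @ w)[:, None] - (Xt @ w)[None, :])

dist, plan = sinkhorn_ot(cost)
print(dist >= 0.0)  # the OT distance under this cost is nonnegative
```

In an adversarial variant, the scorer producing the margins would itself be trained to maximize this discrepancy while the feature representation or classifier minimizes it; the sketch only shows the cost construction and the transport step.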
