

Poster

A Difference Standardization Method for Mutual Transfer Learning

Haoqing Xu · Meng Wang · Beilun Wang

#516

Keywords: [ OPT: Large Scale, Parallel and Distributed ] [ MISC: Supervised Learning ] [ T: Domain Adaptation and Transfer Learning ] [ MISC: Scalable Algorithms ] [ MISC: Transfer, Multitask and Meta-learning ]


Abstract:

In many real-world applications, mutual transfer learning is the paradigm in which each data domain can potentially serve as either a source or a target domain. This differs from standard transfer learning tasks, where the source and target are known a priori. However, previous studies of mutual transfer learning suffer either from high computational complexity or from oversimplified hypotheses. To overcome these challenges, in this paper we propose the Difference Standardization method (DiffS) for mutual transfer learning. Specifically, we put forward a novel distance metric between domains, the standardized domain difference, which enables fast structure recovery and accurate parameter estimation simultaneously. We validate the method's performance on both synthetic and real-world data. Compared to previous methods, DiffS achieves a speed-up of approximately 3000 times while estimating the learnability structure with the same accuracy.
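The abstract names the standardized domain difference without defining it, so the Python sketch below is just one plausible reading, not the authors' algorithm: it assumes each domain is summarized by coefficient estimates with standard errors, standardizes pairwise coefficient differences in a z-statistic-like way, and recovers a learnability structure by thresholding. The names `standardized_difference` and `recover_structure` and the `threshold` parameter are hypothetical, introduced here purely for illustration.

```python
import numpy as np

def standardized_difference(beta_i, beta_j, se_i, se_j):
    """Standardized difference between two domains' coefficient
    estimates: the raw difference normalized by a pooled standard
    error (an assumed z-statistic-like form, not necessarily the
    paper's exact definition)."""
    return float(np.max(np.abs(beta_i - beta_j) / np.sqrt(se_i**2 + se_j**2)))

def recover_structure(betas, ses, threshold=2.0):
    """Group domains whose pairwise standardized difference stays
    below `threshold` (a hypothetical tuning parameter). Groups are
    the connected components of the resulting similarity graph;
    domains in one group are treated as mutually transferable."""
    n = len(betas)
    parent = list(range(n))  # union-find forest over domains

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if standardized_difference(betas[i], betas[j], ses[i], ses[j]) < threshold:
                parent[find(i)] = find(j)  # merge the two groups

    roots = [find(k) for k in range(n)]
    return [sorted(set(roots)).index(r) for r in roots]  # relabel groups 0..G-1

# Toy usage: domains 0 and 1 share coefficients, domain 2 differs.
betas = [np.array([1.0, 2.0]), np.array([1.1, 1.9]), np.array([5.0, -3.0])]
ses = [np.array([0.2, 0.2])] * 3
print(recover_structure(betas, ses))  # -> [0, 0, 1]
```

Under these assumptions, structure recovery reduces to pairwise comparisons of cheap per-domain summary statistics rather than repeated joint model refits, which is one plausible way a method of this shape could obtain the large speed-ups the abstract reports.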
