

Poster

Distribution Free Domain Generalization

Peifeng Tong · Wu Su · He Li · Jialin Ding · Zhan Haoxiang · Song Chen

Exhibit Hall 1 #735

Abstract:

Accurate prediction on out-of-distribution data is desirable for a learning algorithm. In domain generalization, training data from the source domains tend to follow distributions different from that of the target domain, and the target data are absent during training. We propose a Distribution Free Domain Generalization (DFDG) procedure for classification that applies standardization to prevent a few domains from dominating the training process. The essence of DFDG is to reformulate the cross-domain/class discrepancies as pairwise two-sample test statistics and to weight their importance, or the covariance structures, equally so that no single domain or class dominates. A theoretical generalization bound is established for the multi-class classification problem. Empirical studies show that DFDG offers superior performance with fewer hyperparameters, allowing faster and easier implementation.
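To make the equal-weighting idea in the abstract concrete, here is a minimal sketch of an equally weighted pairwise-discrepancy penalty across source domains. It is only an illustration of the general idea, not the paper's implementation: the choice of an RBF-kernel MMD^2 as the two-sample statistic, the standardization step, and all function names are assumptions.

```python
# Minimal sketch of an equal-weight pairwise-discrepancy penalty.
# Assumptions (not from the paper): an RBF-kernel MMD^2 serves as the
# two-sample statistic, and each source domain's features are a NumPy array.
import itertools
import numpy as np


def rbf_kernel(x, y, bandwidth=1.0):
    """RBF kernel matrix between the rows of x and the rows of y."""
    sq_dists = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))


def mmd2(x, y, bandwidth=1.0):
    """Biased estimate of the squared MMD between samples x and y."""
    return (rbf_kernel(x, x, bandwidth).mean()
            + rbf_kernel(y, y, bandwidth).mean()
            - 2.0 * rbf_kernel(x, y, bandwidth).mean())


def standardize(x):
    """Per-domain standardization so no domain dominates by scale."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)


def equal_weight_domain_discrepancy(domain_features, bandwidth=1.0):
    """Average the pairwise two-sample statistic over all source-domain
    pairs, giving every pair the same weight."""
    pairs = itertools.combinations(range(len(domain_features)), 2)
    stats = [mmd2(standardize(domain_features[i]),
                  standardize(domain_features[j]), bandwidth)
             for i, j in pairs]
    return float(np.mean(stats))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three toy source domains with shifted means.
    domains = [rng.normal(loc=mu, size=(100, 16)) for mu in (0.0, 0.5, 1.0)]
    print("equal-weight discrepancy:", equal_weight_domain_discrepancy(domains))
```

In practice such a penalty would be added to the classification loss so that the learned representation keeps the source domains aligned; the paper's exact statistic, weighting over classes, and optimization details are given in the full text.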
