

Poster in Workshop: Principles of Distribution Shift (PODS)

Style Balancing and Test-Time Style Shifting for Domain Generalization

Jungwuk Park · Dong-Jun Han · Soyeong Kim · Jaekyun Moon


Abstract:

Recent works on domain generalization have shown great success by generating new feature statistics (or style statistics) during training, which exposes the model to diverse domains or styles. However, existing works suffer from the cross-domain class imbalance problem, which naturally arises in domain generalization. Their performance also degrades when the gap between the style statistics of the source and target domains is large (i.e., when the distribution shift is large in the feature-level style space). In this paper, we propose new strategies to improve robustness against potential domain shift. We first propose style balancing, which strategically balances the number of samples for each class across all source domains to improve domain diversity during training. We then propose test-time style shifting, which shifts the style of a test sample that has a large style gap with the source domains to the nearest source domain, improving prediction performance.
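As a concrete illustration of the second idea, here is a minimal PyTorch sketch of test-time style shifting. It assumes that styles are channel-wise feature statistics (means and standard deviations, as in AdaIN-based augmentation) and that per-domain source statistics were recorded during training. The function names, the stored-statistics format, and the L2 distance in style space are illustrative assumptions, not necessarily the authors' exact formulation.

```python
import torch

def style_stats(feat, eps=1e-6):
    """Channel-wise style statistics of a (N, C, H, W) feature map."""
    mu = feat.mean(dim=(2, 3), keepdim=True)
    sigma = feat.var(dim=(2, 3), keepdim=True).add(eps).sqrt()
    return mu, sigma

def shift_to_nearest_source_style(test_feat, source_styles):
    """Replace the test sample's style with that of the closest source domain.

    source_styles: list of (mu, sigma) tensors of shape (1, C, 1, 1),
    e.g. running averages collected from each source domain during training.
    """
    mu_t, sigma_t = style_stats(test_feat)
    # Distance in (mu, sigma) space; the L2 metric here is an assumption.
    dists = torch.stack([torch.norm(mu_t - mu_s) + torch.norm(sigma_t - sigma_s)
                         for mu_s, sigma_s in source_styles])
    mu_s, sigma_s = source_styles[int(torch.argmin(dists))]
    # AdaIN-style re-stylization: keep the content, swap in the source style.
    return (test_feat - mu_t) / sigma_t * sigma_s + mu_s
```

In this sketch, test_feat would be an intermediate feature map of the test sample; the shifted features are then passed through the remaining layers of the network for prediction.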
