Causal Balancing for Domain Generalization
Xinyi Wang · Michael Saxon · Jiachen Li · Hongyang Zhang · Kun Zhang · William Wang
Event URL: https://openreview.net/forum?id=imav8hheb2M

While machine learning models rapidly advance the state-of-the-art on various real-world tasks, out-of-domain (OOD) generalization remains a challenging problem given these models' vulnerability to spurious correlations. We propose a balanced mini-batch sampling strategy to reduce the domain-specific spurious correlations in the observed training distributions. More specifically, we propose a two-phased method that 1) identifies the source of spurious correlations, and 2) builds balanced mini-batches free from spurious correlations by matching on the identified source. We provide an identifiability guarantee for the source of spuriousness and show that our proposed approach samples from a balanced, spurious-free distribution under an ideal scenario. Experiments on three domain generalization datasets demonstrate empirically that our balanced mini-batch sampling strategy improves the performance of four established domain generalization baselines compared to random mini-batch sampling.
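The abstract does not give implementation details, but the balancing idea in phase 2 can be illustrated with a minimal sketch: once a spurious attribute has been identified for each training example, mini-batches are drawn so that every (label, spurious-attribute) group is equally represented, removing the within-batch association between label and spurious source. The function name `balanced_minibatches` and the dictionary keys `x`, `y`, `s` below are hypothetical and not taken from the paper.

```python
import random
from collections import defaultdict

def balanced_minibatches(examples, batch_size, seed=0):
    """Yield mini-batches in which each (label, spurious-attribute) group
    is equally represented, so the spurious attribute carries no signal
    about the label within a batch.

    `examples` is a list of dicts with keys 'x' (input), 'y' (label),
    and 's' (the spurious attribute estimated in phase 1). This is an
    illustrative sketch, not the authors' actual implementation.
    """
    rng = random.Random(seed)

    # Group examples by (label, spurious attribute).
    groups = defaultdict(list)
    for ex in examples:
        groups[(ex["y"], ex["s"])].append(ex)

    group_keys = list(groups.keys())
    per_group = max(1, batch_size // len(group_keys))

    while True:
        batch = []
        for key in group_keys:
            # Sample with replacement so small groups never run out.
            batch.extend(rng.choices(groups[key], k=per_group))
        rng.shuffle(batch)
        yield batch
```

In this sketch the sampler can be dropped in wherever random mini-batches are normally drawn, e.g. `next(balanced_minibatches(train_set, 64))` in place of a shuffled loader, leaving the downstream domain generalization model unchanged.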

Author Information

Xinyi Wang (University of California, Santa Barbara)
Michael Saxon (UC Santa Barbara)
Jiachen Li (University of California, Santa Barbara)
Hongyang Zhang (University of Waterloo)
Kun Zhang (Carnegie Mellon University)
William Wang (UCSB)
