Poster in Workshop: ICML 2024 Workshop on Foundation Models in the Wild

Leveraging Generative Foundation Models for Domain Generalization

Sobhan Hemati · Mahdi Beitollahi · Amir Estiri · Bassel Al Omari · Xi Chen · Guojun Zhang

Keywords: [ Diffusion Models ] [ Generative Foundation Models ] [ Domain Generalization ]


Abstract:

There has been a substantial effort to tackle the Domain Generalization (DG) problem, with a focus on developing new loss functions. Inspired by the capabilities of diffusion models, we pose a pivotal question: can diffusion models function as data augmentation tools that address DG from a data-centric perspective, rather than relying on loss functions? We show that a simple cross-domain data augmentation (CDGA) scheme, combined with vanilla empirical risk minimization (ERM) and readily available diffusion models, outperforms state-of-the-art (SOTA) DG algorithms. To explain the success of CDGA, we show experimentally that CDGA reduces the distribution shift between domains, which is the main reason ERM fails to generalize out of distribution (OOD) under domain shift. These results advocate for further investigation into the potential of SOTA generative models for tackling the representation learning problem.
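To make the idea concrete, below is a minimal sketch of cross-domain augmentation with an off-the-shelf diffusion model, assuming the Hugging Face `diffusers` library and a Stable Diffusion img2img pipeline. The domain names, prompt template, and `strength` value are illustrative assumptions, not the authors' exact configuration.

```python
# Hedged sketch: cross-domain data augmentation with a pretrained
# diffusion model, followed by standard ERM training on the pooled data.
# Domain labels, prompts, and hyperparameters are illustrative assumptions.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Hypothetical domain names, e.g. in the style of the PACS benchmark.
DOMAINS = ["photo", "art painting", "cartoon", "sketch"]

def cross_domain_augment(image: Image.Image, class_name: str, source_domain: str):
    """Generate variants of `image` rendered in every *other* domain's style.

    The augmented samples are intended to interpolate between domains,
    reducing the distribution shift that plain ERM struggles with.
    """
    augmented = []
    for target in DOMAINS:
        if target == source_domain:
            continue
        prompt = f"a {target} of a {class_name}"
        # `strength` controls how far the model moves the input toward the
        # prompt; lower values stay closer to the source image.
        out = pipe(prompt=prompt, image=image, strength=0.5, guidance_scale=7.5)
        augmented.append(out.images[0])
    return augmented
```

The augmented images are simply added to the training set; no specialized DG loss is needed, and a standard ERM objective (e.g., cross-entropy on the pooled data) is trained as usual.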
