CLIPood: Generalizing CLIP to Out-of-Distributions
Yang Shu · Xingzhuo Guo · Jialong Wu · Ximei Wang · Jianmin Wang · Mingsheng Long

Wed Jul 26 05:00 PM -- 06:30 PM (PDT) @ Exhibit Hall 1 #101

Out-of-distribution (OOD) generalization, where the model needs to handle distribution shifts from training, is a major challenge of machine learning. Contrastive language-image pre-training (CLIP) models have shown impressive zero-shot ability, but further adaptation of CLIP on downstream tasks undesirably degrades OOD performance. This paper aims to generalize CLIP to out-of-distribution test data on downstream tasks. We propose CLIPood, a fine-tuning method that can adapt CLIP models to OOD situations where both domain shifts and open classes may occur in the unseen test data. To exploit the semantic relations between classes from the text modality, CLIPood introduces a new training objective, margin metric softmax (MMS), with class-adaptive margins for fine-tuning. To incorporate both the pre-trained zero-shot model and the fine-tuned task-adaptive model, CLIPood leverages a new optimization strategy, Beta moving average (BMA), to maintain a temporal ensemble weighted by the Beta distribution. Experiments on diverse datasets with different OOD scenarios show that CLIPood consistently outperforms existing generalization techniques.
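The two components named in the abstract can be illustrated with a minimal sketch. Note the assumptions: the function names, the exact margin form (scaled by one minus text-embedding similarity), and the Beta(0.5, 0.5) parameters below are illustrative choices, not necessarily the authors' exact formulation; the intent is only to show class-adaptive margins in a softmax loss and a U-shaped Beta weighting over checkpoints.

```python
import math

def mms_loss(logits, label, text_sim, margin=0.1, scale=1.0):
    """Sketch of a margin metric softmax (MMS) for one sample.

    logits:   per-class image-text similarities (list of floats)
    label:    index of the ground-truth class
    text_sim: class-by-class text-embedding similarity matrix

    A class-adaptive margin is added to each non-target logit: classes
    whose text embedding is far from the true class get a larger margin,
    pushing the decision boundary further from semantically unrelated
    classes. (Hypothetical margin form, assumed for illustration.)
    """
    adj = [scale * (l + (0.0 if j == label
                         else margin * (1.0 - text_sim[label][j])))
           for j, l in enumerate(logits)]
    # numerically stable log-sum-exp, then negative log-likelihood
    mx = max(adj)
    log_z = mx + math.log(sum(math.exp(a - mx) for a in adj))
    return log_z - adj[label]

def beta_weights(num_ckpts, alpha=0.5, beta=0.5):
    """Beta(alpha, beta) pdf sampled at evenly spaced points in (0, 1).

    With alpha = beta = 0.5 the curve is U-shaped, so both the zero-shot
    initialization (t near 0) and the fully fine-tuned model (t near 1)
    receive large weights in the temporal ensemble.
    """
    ts = [(i + 0.5) / num_ckpts for i in range(num_ckpts)]
    w = [t ** (alpha - 1) * (1 - t) ** (beta - 1) for t in ts]
    s = sum(w)
    return [x / s for x in w]

def bma_ensemble(checkpoints, alpha=0.5, beta=0.5):
    """Beta moving average: per-parameter weighted mean over checkpoints.

    checkpoints: list of parameter vectors (lists of floats), ordered
    from the pre-trained model to the final fine-tuned model.
    """
    w = beta_weights(len(checkpoints), alpha, beta)
    dim = len(checkpoints[0])
    return [sum(wi * ck[d] for wi, ck in zip(w, checkpoints))
            for d in range(dim)]
```

For example, with two checkpoints and symmetric Beta(0.5, 0.5) weights, `bma_ensemble` reduces to the plain midpoint of the zero-shot and fine-tuned parameters; asymmetric `alpha`/`beta` would tilt the ensemble toward one end of training.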

Author Information

Yang Shu (Tsinghua University)
Xingzhuo Guo (Tsinghua University)
Jialong Wu (Tsinghua University)
Ximei Wang (Tencent)
Jianmin Wang (Tsinghua University)
Mingsheng Long (Tsinghua University)