
Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Test-time Adaptation with Diffusion Models

Mihir Prabhudesai · Tsung-Wei Ke · Alexander Li · Deepak Pathak · Katerina Fragkiadaki

Keywords: [ Test-time Adaptation ] [ Classification ] [ Diffusion Models ] [ Generative Models ]


Abstract:

We find that generative models can be great test-time adapters for discriminative models. We propose a method to adapt pretrained classifiers and large-scale CLIP models to individual unlabelled images by modulating the text conditioning of a text-conditional pretrained image diffusion model and maximizing the image likelihood using end-to-end backpropagation to the classifier parameters. We improve the classification accuracy of various pretrained classifiers on various datasets, including ImageNet and its variants. Further, we show that our approach significantly outperforms previous test-time adaptation methods. To the best of our knowledge, this is the first work that adapts pretrained large-scale discriminative models to individual images; all previous works require co-training under joint discriminative and self-supervised objectives to be applied at test time, which prevents them from adapting readily available models.
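The abstract describes the adaptation loop only at a high level. A minimal sketch of one possible test-time step, assuming a PyTorch-style classifier, a text-conditional diffusion model that exposes a `q_sample` noising function and a noise-prediction network, and a hypothetical `diffusion_tta_step` helper (all names here are assumptions, not the authors' code), might look like:

```python
import torch
import torch.nn.functional as F

def diffusion_tta_step(classifier, diffusion, text_embeds, image, optimizer):
    """One hypothetical adaptation step: use the diffusion model's denoising
    loss on the single test image as the objective and backpropagate it into
    the classifier's parameters."""
    # Classifier predicts class probabilities for the test image.
    probs = classifier(image).softmax(dim=-1)          # (1, num_classes)

    # Modulate the diffusion model's text conditioning: here, a
    # probability-weighted average of per-class text embeddings
    # (an assumption about how the conditioning is formed).
    cond = probs @ text_embeds                          # (1, embed_dim)

    # Standard denoising loss at a randomly sampled timestep.
    t = torch.randint(0, diffusion.num_timesteps, (1,), device=image.device)
    noise = torch.randn_like(image)
    noisy = diffusion.q_sample(image, t, noise)         # add noise at step t
    pred_noise = diffusion.model(noisy, t, cond)        # predict the noise

    # Maximizing image likelihood is approximated by minimizing the
    # denoising loss; gradients flow end-to-end into the classifier.
    loss = F.mse_loss(pred_noise, noise)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return probs.detach()
```

In this sketch the optimizer would be constructed over `classifier.parameters()` only, so the diffusion model stays frozen and acts purely as a per-image likelihood signal.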
