
Poster

AdaNPC: Exploring Non-Parametric Classifier for Test-Time Adaptation

Yi-Fan Zhang · Xue Wang · Kexin Jin · Kun Yuan · Zhang Zhang · Liang Wang · Rong Jin · Tieniu Tan

Exhibit Hall 1 #209

Abstract: Many recent machine learning tasks focus on developing models that can generalize to unseen distributions, and domain generalization (DG) has become a key topic across many fields. Several studies show that DG can be arbitrarily hard without exploiting target domain information. To address this issue, test-time adaptation (TTA) methods have been proposed. Existing TTA methods require offline target data or extra sophisticated optimization procedures during the inference stage. In this work, we adopt a **N**on-**P**arametric **C**lassifier to perform test-time **Ada**ptation (**AdaNPC**). In particular, we construct a memory that contains feature and label pairs from the training domains. During inference, given a test instance, AdaNPC first recalls the $k$ closest samples from the memory to vote for the prediction, and then the test feature and predicted label are added to the memory. In this way, the sample distribution in the memory gradually shifts from the training distribution toward the test distribution with very little extra computation cost. We theoretically justify the rationality behind the proposed method and evaluate it in extensive numerical experiments. AdaNPC significantly outperforms competitive baselines on various DG benchmarks. In particular, when the adaptation target is a series of domains, the adaptation accuracy of AdaNPC is $50$% higher than that of advanced TTA methods.
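
The inference loop described above (recall $k$ nearest neighbors, vote, then append the test feature with its predicted label) can be sketched in a few lines. The snippet below is a minimal illustration in NumPy, not the authors' implementation: the `Memory` class, the `predict_and_update` method, the Euclidean distance metric, and the majority-vote rule are all assumptions for clarity.

```python
# Minimal sketch of an AdaNPC-style non-parametric test-time classifier.
# Hypothetical names and metric; the paper's actual design may differ.
import numpy as np
from collections import Counter

class Memory:
    def __init__(self, features, labels):
        # Feature/label pairs collected from the training domains.
        self.features = np.asarray(features, dtype=np.float64)
        self.labels = list(labels)

    def predict_and_update(self, x, k=5):
        x = np.asarray(x, dtype=np.float64)
        # Recall the k closest stored samples (Euclidean distance assumed).
        dists = np.linalg.norm(self.features - x, axis=1)
        nearest = np.argsort(dists)[:k]
        # Majority vote over the neighbors' labels gives the prediction.
        votes = Counter(self.labels[i] for i in nearest)
        pred = votes.most_common(1)[0][0]
        # Append the test feature and its predicted label, so the memory
        # gradually drifts from the training toward the test distribution.
        self.features = np.vstack([self.features, x])
        self.labels.append(pred)
        return pred

# Usage: build the memory from training-domain features, then stream test points.
mem = Memory(features=np.random.randn(100, 16),
             labels=np.random.randint(0, 3, size=100))
print(mem.predict_and_update(np.random.randn(16), k=5))
```

Because the update is just a memory append followed by a nearest-neighbor lookup, there is no gradient computation at test time, which is the source of the "very little extra computation cost" claim in the abstract.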
