Efficient Test-Time Model Adaptation without Forgetting
Shuaicheng Niu · Jiaxiang Wu · Yifan Zhang · Yaofo Chen · Shijian Zheng · Peilin Zhao · Mingkui Tan

Tue Jul 19 03:30 PM -- 05:30 PM (PDT) @ Hall E #509

Test-time adaptation provides an effective means of tackling the potential distribution shift between model training and inference, by dynamically updating the model at test time. This area has seen fast progress recently in improving the effectiveness of handling test-time shifts. Nonetheless, prior methods still suffer from two key limitations: 1) they rely on performing backward computation for each test sample, which takes a considerable amount of time; and 2) they focus on improving the performance on out-of-distribution test samples and ignore that adaptation on test data may cause catastrophic forgetting, i.e., the performance on in-distribution test samples may degrade. To address these issues, we propose an efficient anti-forgetting test-time adaptation (EATA) method. Specifically, we devise a sample-efficient entropy minimization loss that excludes uninformative samples from backward computation, which improves the overall efficiency and meanwhile boosts the out-of-distribution accuracy. Afterward, we introduce a regularization loss to ensure that critical model weights tend to be preserved during adaptation, thereby alleviating the forgetting issue. Extensive experiments on CIFAR-10-C, ImageNet-C, and ImageNet-R verify the effectiveness and superiority of our EATA.
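The two losses described in the abstract can be illustrated with a minimal sketch: filter out high-entropy (unreliable) test predictions before any backward pass, weight the remaining entropy terms, and add an EWC-style Fisher-weighted penalty that anchors important weights to their pre-adaptation values. The function name, the entropy margin, and the weighting scheme below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def entropy(probs):
    # Shannon entropy of each row of predicted class probabilities
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

def eata_style_loss(logits, theta, theta_anchor, fisher,
                    e_margin=0.4 * np.log(10), reg_weight=1.0):
    """Hypothetical sketch of an EATA-style objective:
    - skip high-entropy (uninformative) samples so they incur no backward pass,
    - minimize a weighted entropy on the reliable samples,
    - add a Fisher-weighted penalty keeping critical weights near their
      pre-adaptation values (anti-forgetting regularizer)."""
    # numerically stable softmax
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    ent = entropy(probs)
    # keep only low-entropy samples; the margin value here is illustrative
    keep = ent < e_margin
    # lower entropy -> larger weight (one plausible weighting choice)
    weights = np.exp(e_margin - ent[keep])
    adapt_loss = np.sum(weights * ent[keep]) / max(keep.sum(), 1)
    # anti-forgetting term: penalize drift on weights with high Fisher importance
    reg = reg_weight * np.sum(fisher * (theta - theta_anchor) ** 2)
    return adapt_loss + reg, keep
```

In this sketch, a confident prediction (one dominant logit) passes the filter while a near-uniform prediction is excluded, so only the confident sample would contribute gradients during adaptation.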

Author Information

Shuaicheng Niu (South China University of Technology)
Jiaxiang Wu (Tencent AI Lab)
Yifan Zhang (National University of Singapore)
Yaofo Chen (South China University of Technology)
Shijian Zheng (South China University of Technology)
Peilin Zhao (Tencent AI Lab)
Mingkui Tan (South China University of Technology)
