Poster in Workshop: The Second Workshop on Spurious Correlations, Invariance and Stability
Uncertainty-Guided Online Test-Time Adaptation via Meta-Learning
Kyubyung Chae · Taesup Kim
In real-world scenarios, machine learning systems may continually experience distributional shifts due to many different factors in the test environment, which can make their predictions unreliable. It is therefore important to learn a model that can robustly adapt to its environment in an online manner. In this work, we propose to meta-learn how to guide unsupervised online adaptation by taking the uncertainty of the predictions into account. Typically, all unlabeled test samples are incorporated equally during online test-time adaptation; however, highly uncertain samples can degrade adaptation performance. We therefore enable the model to adaptively weight test samples by quantifying their uncertainty during test-time online adaptation. We experimentally show that our uncertainty-guided online adaptation improves robustness and adaptation performance at test time on image classification tasks under distributional shift.
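To make the idea of uncertainty-guided adaptation concrete, the sketch below shows one possible form of the mechanism described above: an online entropy-minimization update in which each unlabeled test sample's contribution is down-weighted when its predictive uncertainty is high. This is a minimal illustration, not the authors' implementation; the softmax entropy as the uncertainty measure, the hand-crafted weighting function (standing in for the meta-learned guidance), and the toy model, optimizer, and data are all assumptions for illustration.

```python
# Minimal sketch of uncertainty-weighted online test-time adaptation (PyTorch).
# The weighting rule is a hand-crafted stand-in for the meta-learned guidance
# described in the abstract; model, data, and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


def predictive_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Per-sample entropy of the softmax predictions (a simple uncertainty proxy)."""
    probs = F.softmax(logits, dim=1)
    return -(probs * torch.log(probs + 1e-8)).sum(dim=1)


def adapt_on_batch(model: nn.Module, optimizer: torch.optim.Optimizer,
                   x: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """One online adaptation step on an unlabeled test batch.

    Each sample's entropy-minimization loss is down-weighted when its
    predictive uncertainty is high, so unreliable samples influence the
    parameter update less.
    """
    logits = model(x)
    entropy = predictive_entropy(logits)                     # uncertainty per sample
    weights = torch.softmax(-entropy / temperature, dim=0)   # low entropy -> larger weight
    loss = (weights.detach() * entropy).sum()                # weighted unsupervised objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return logits.detach()


if __name__ == "__main__":
    # Toy usage: adapt a small classifier online over a simulated test stream.
    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
    for _ in range(5):
        x = torch.randn(16, 32)                              # unlabeled, possibly shifted batch
        preds = adapt_on_batch(model, optimizer, x).argmax(dim=1)
```

In this sketch the weights are detached so the gradient flows only through the per-sample entropies; in the paper's framing, the guidance that decides how much each sample should contribute is itself meta-learned rather than fixed by a temperature-scaled softmax as done here.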