Blocking the Leakage: Manifold-Aware Gradient Projection for Long-Horizon Test-Time Adaptation
Abstract
Test-Time Adaptation (TTA) enables pre-trained models to adapt online to distribution shifts during inference, but such online updates often become unstable in long-horizon deployments. Prevailing approaches attribute this failure to error accumulation from noisy pseudo-labels and rely on heuristics to gate which samples are used for updates. We argue that this statistical view is insufficient: the problem lies not only in the quality of samples but also in the directionality of their gradients. In this work, we identify a geometric failure mode that we term manifold erosion. Through spectral analysis, we find that reliable gradients concentrate in a stable low-rank subspace, whereas gradients from confident mispredictions are high-rank yet exhibit persistent directional leakage into this protected subspace. This leakage can accumulate coherently, gradually eroding core representations and eventually leading to collapse. To address this, we propose Manifold-Aware Gradient Projection (MGP), a geometric intervention that tracks the dominant subspace online and projects incoming gradients onto its orthogonal complement. By blocking the leakage path, MGP decouples stability from plasticity. Extensive experiments on diverse TTA benchmarks show that our method remains stable over long horizons where prior methods often collapse.
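To make the projection step concrete, the following is a minimal NumPy sketch of the idea stated in the abstract, not the paper's implementation: it keeps a running covariance of gradients judged reliable, takes its top eigenvectors as the protected subspace, and removes that subspace component from subsequent gradients. The names (`MGPProjector`, `rank`, `decay`) and the full-covariance estimator are illustrative assumptions; a practical version would likely operate per layer with a streaming low-rank update rather than an explicit d×d covariance.

```python
import numpy as np

class MGPProjector:
    """Illustrative sketch: track a low-rank 'protected' gradient subspace and
    remove the component of new gradients that leaks into it."""

    def __init__(self, dim: int, rank: int = 8, decay: float = 0.99):
        self.rank = rank
        self.decay = decay
        # Running second-moment estimate of gradients deemed reliable.
        # O(dim^2) memory; kept simple for illustration only.
        self.cov = np.zeros((dim, dim))
        self.basis = None  # orthonormal columns spanning the dominant subspace

    def update_subspace(self, reliable_grad: np.ndarray) -> None:
        """Fold a reliable gradient into the covariance estimate and refresh
        the top-`rank` eigenvector basis."""
        g = reliable_grad.reshape(-1, 1)
        self.cov = self.decay * self.cov + (1.0 - self.decay) * (g @ g.T)
        eigvals, eigvecs = np.linalg.eigh(self.cov)   # eigenvalues in ascending order
        self.basis = eigvecs[:, -self.rank:]          # keep the top-`rank` directions

    def project(self, grad: np.ndarray) -> np.ndarray:
        """Project a (potentially unreliable) gradient onto the orthogonal
        complement of the protected subspace, blocking the leakage path."""
        if self.basis is None:
            return grad
        coeffs = self.basis.T @ grad          # component inside the protected subspace
        return grad - self.basis @ coeffs     # remove it

# Tiny usage example with random gradients (small dimensions for illustration).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    proj = MGPProjector(dim=32, rank=4)
    for _ in range(50):
        proj.update_subspace(rng.standard_normal(32))
    filtered = proj.project(rng.standard_normal(32))
    # The filtered gradient has (numerically) no component in the protected subspace.
    print(np.abs(proj.basis.T @ filtered).max())  # ~0
```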