FedPAT: Federated Test-Time Adaptation via Prototype Affinity Topology
Abstract
Federated Learning (FL) enables privacy-preserving collaboration among distributed clients in open-world environments, but its performance often degrades under data heterogeneity and unpredictable distribution shifts. Test-Time Adaptation (TTA) has recently been introduced into FL to leverage unlabeled data from unseen clients for online adaptation. However, most existing federated TTA methods rely on local feature statistics, which can be brittle under diverse and severe distribution shifts. In this work, we observe that despite significant variations in feature distributions, the relational structure among class prototypes—termed prototype affinity topology—remains remarkably stable across heterogeneous clients. Building on this insight, we propose FedPAT, a Federated TTA framework that leverages Prototype Affinity Topology as a cross-client structural prior. FedPAT learns a global PAT by aggregating class prototypes from source clients, capturing consensus inter-class relationships that are robust to local distribution variations. For unseen target clients, we design a topology-aware mechanism that refines predictions by diffusing them over the global PAT, fuses them with parametric outputs, and performs lightweight optimization for robust test-time adaptation. Extensive experiments demonstrate that FedPAT consistently outperforms advanced federated TTA and classical TTA methods across various distribution shifts.
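The pipeline summarized above—computing per-client class prototypes, aggregating them into a global affinity topology, and refining test-time predictions by propagating them over that topology—can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the cosine-similarity affinity, the row-normalization, and the single-step diffusion with fusion weight `alpha` are all illustrative assumptions.

```python
import numpy as np

def client_prototypes(features, labels, num_classes):
    """Mean feature vector per class for one client (illustrative)."""
    d = features.shape[1]
    protos = np.zeros((num_classes, d))
    for c in range(num_classes):
        protos[c] = features[labels == c].mean(axis=0)
    return protos

def global_affinity_topology(all_protos):
    """Average cosine-similarity matrix over clients' prototype sets
    (an assumed stand-in for the aggregated global PAT)."""
    mats = []
    for P in all_protos:
        Pn = P / np.linalg.norm(P, axis=1, keepdims=True)
        mats.append(Pn @ Pn.T)
    A = np.mean(mats, axis=0)
    # Clip negatives and row-normalize so diffusion preserves probability mass.
    A = np.clip(A, 0.0, None)
    return A / A.sum(axis=1, keepdims=True)

def diffuse_predictions(probs, A, alpha=0.7):
    """Fuse parametric outputs with topology-propagated predictions
    (alpha is a hypothetical fusion weight)."""
    return alpha * probs + (1.0 - alpha) * probs @ A
```

At a target client, `diffuse_predictions` would be applied to the model's softmax outputs before any lightweight test-time optimization; because the rows of `A` and of `probs` each sum to one, the fused outputs remain valid probability distributions.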