RC-FCL: Combating Asynchronous Concept Drift in Federated Continual Learning via Retrospective Calibration
Abstract
Federated Continual Learning (FCL) enables models to continually acquire knowledge from streaming tasks, but it must contend with the temporal dynamics of client data distributions. These dynamics naturally induce asynchronous concept drift: distribution shifts that occur independently across clients, at unsynchronized times and with varying magnitudes. Such asynchrony produces conflicting updates that destabilize global convergence and exacerbate catastrophic forgetting. Existing FCL research, however, focuses on static or incremental settings and typically treats all incoming updates uniformly, which both obscures concept drift under divergent client distributions and fails to adapt to the evolution of learned concepts. To address these limitations, we propose RC-FCL, a retrospective calibration framework that effectively detects asynchronous concept drift and adapts the learning strategy accordingly. Specifically, RC-FCL leverages a conditional generative model to synthesize class-conditional reference distributions of previously learned concepts for local drift detection; it calibrates local adaptation with a weighting mechanism driven by the local discriminator that prioritizes informative samples, and applies a global aggregation strategy guided by drift magnitude. Our experimental results demonstrate that RC-FCL achieves competitive performance against state-of-the-art methods.
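The two ideas sketched in the abstract, measuring each client's drift against a generated reference distribution and down-weighting heavily drifted clients at aggregation time, can be illustrated in miniature. This is only a hedged sketch: the `drift_score` statistic (a mean-feature distance) and the exponential down-weighting in `aggregate` are illustrative placeholders, not the paper's actual conditional-generative detector or aggregation rule; the Gaussian data stands in for generator outputs and client features.

```python
import numpy as np

def drift_score(reference, current):
    """Illustrative drift statistic: L2 distance between the mean feature
    of generated reference samples and the client's current local features.
    (A stand-in for the paper's generative drift-detection mechanism.)"""
    return float(np.linalg.norm(reference.mean(axis=0) - current.mean(axis=0)))

def aggregate(client_updates, drift_scores, tau=1.0):
    """Hypothetical drift-magnitude-aware aggregation: clients with larger
    drift contribute less via exponential down-weighting."""
    scores = np.asarray(drift_scores, dtype=float)
    alphas = np.exp(-scores / tau)
    alphas /= alphas.sum()                      # normalize to a convex combination
    stacked = np.stack(client_updates)          # shape: (n_clients, dim)
    return alphas @ stacked                     # drift-weighted average update

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(256, 8))   # stand-in for generator samples
stable    = rng.normal(0.0, 1.0, size=(256, 8))   # client with no drift
drifted   = rng.normal(2.0, 1.0, size=(256, 8))   # client whose concept shifted

scores = [drift_score(reference, stable), drift_score(reference, drifted)]
updates = [rng.normal(size=8), rng.normal(size=8)]
global_update = aggregate(updates, scores)
print(scores)   # the drifted client receives a larger drift score
```

Under this toy setup the drifted client's score dominates, so its update is suppressed in the global average, mirroring the intent of drift-magnitude-based aggregation.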