Lightweight Federated Incremental Learning via Decoupled Replay
Abstract
Federated Incremental Learning (FIL) aims to learn streaming tasks across distributed clients without catastrophic forgetting while preserving privacy. Most existing methods rely on sample-based replay, which mitigates forgetting by replaying stored historical samples. However, such methods pose data privacy risks and incur significant storage and communication overhead, making them difficult to deploy on resource-constrained edge devices. To address these challenges, we propose a novel \underline{Li}ghtweight \underline{F}ederated \underline{I}ncremental \underline{L}earning framework, \textbf{Li-FIL}, which leverages dense features synthesized by a secure generator on the server to enable efficient feature-based replay on decoupled local models. Specifically, each client extracts high-confidence features from the new task, applies mixup to obtain dense feature representations, and privatizes these features before uploading them to the server, reducing both storage and communication overhead. A generator deployed on the server learns the feature distributions of different clients and generates global features for replay. Moreover, to help clients learn from these dense features, we decouple the local model into two components, a feature extractor and a classifier, allowing feature replay and the alignment between new and previous features to be conducted separately and more effectively. Extensive experiments demonstrate that Li-FIL outperforms state-of-the-art methods by up to 10.14 in accuracy on both old and new tasks while achieving superior resource efficiency.
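The client-side step described above (keeping only high-confidence features and densifying them with mixup) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: features are taken to be flat vectors, "confidence" is assumed to be the maximum softmax probability of the local classifier's logits, and mixup is applied between random feature pairs with a Beta-distributed mixing coefficient; all names and thresholds are illustrative, and the privatization and upload steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits):
    # Numerically stable row-wise softmax.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def select_confident(features, logits, tau=0.5):
    # Keep features whose maximum softmax probability exceeds tau
    # (one plausible reading of "high-confidence features").
    conf = softmax(logits).max(axis=1)
    return features[conf > tau]

def mixup_features(features, alpha=0.2, n_out=32):
    # Standard mixup applied in feature space:
    # f_mix = lam * f_i + (1 - lam) * f_j, lam ~ Beta(alpha, alpha).
    i = rng.integers(0, len(features), size=n_out)
    j = rng.integers(0, len(features), size=n_out)
    lam = rng.beta(alpha, alpha, size=(n_out, 1))
    return lam * features[i] + (1 - lam) * features[j]

# Toy usage: 100 samples with 64-dim features and 10-class logits.
feats = rng.normal(size=(100, 64))
logits = 5.0 * rng.normal(size=(100, 10))  # scaled so some rows are confident
dense = mixup_features(select_confident(feats, logits))
print(dense.shape)  # (32, 64): the dense feature set a client would privatize and upload
```

The Beta(0.2, 0.2) prior concentrates the mixing coefficient near 0 and 1, so mixed features stay close to real ones while still interpolating between them; the exact distribution used by Li-FIL is not specified in the abstract.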