Forgetting Whenever You Want: A Decentralized Continual Learning Framework with On-Demand Unlearning
Abstract
Decentralized class continual learning refers to a paradigm in which distributed clients continuously acquire new classes while retaining previously learned knowledge, without relying on a central server. With growing emphasis on privacy preservation, there is an increasing need for on-demand unlearning, which introduces two key challenges: Historical Class Unlearning and Network-Wide Knowledge Entanglement. In this work, we propose a decentralized continual learning framework with on-demand unlearning (DCU), the first attempt to achieve class continual learning and arbitrary-time class unlearning in a distributed setting. Specifically, DCU comprises three main stages: prototype extraction, prototype-guided continual learning, and unlearning with disposable prototypes. First, the prototype extraction mechanism captures class-specific concepts as lightweight, disposable embeddings. Then, synthetic data guided by these prototypes is combined with real data to achieve incremental learning through distillation. Finally, synthetic samples with noisy labels guide the adjustment of the model's decision boundary, effectively erasing the influence of the target class while preserving the knowledge of other classes. Extensive experiments on two datasets demonstrate the effectiveness of DCU in both dynamic learning and target-class unlearning.