

Poster

Federated Continual Learning via Prompt-based Dual Knowledge Transfer

Hongming Piao · Yichen WU · Dapeng Wu · Ying WEI


Abstract:

In Federated Continual Learning (FCL), the central challenge is to facilitate knowledge transfer and improve performance across the tasks that different clients learn over time. Current FCL methods focus predominantly on avoiding interference between tasks, thereby overlooking the potential for positive knowledge transfer across tasks learned by different clients at different times. To address this issue, we introduce a Prompt-based Knowledge Transfer FCL algorithm, called PKT-FCL, designed to foster the transfer of knowledge encapsulated in prompts between sequentially learned tasks and across clients. Furthermore, we devise a new approach to prompt generation and aggregation that mitigates privacy concerns and communication overhead while still promoting knowledge transfer. Comprehensive experimental results demonstrate the superiority of our method in reducing communication costs and enhancing knowledge transfer.
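The abstract notes that exchanging prompts rather than full models reduces communication overhead. As a rough illustration only (not the PKT-FCL algorithm itself, whose prompt generation and aggregation scheme is described in the paper), the sketch below shows a generic prompt-based federated round: each client updates a small prompt tensor against a frozen backbone, and the server performs a simple FedAvg-style mean over the prompts. All function names, shapes, and the averaging rule are illustrative assumptions.

```python
import numpy as np

def client_prompt_update(prompt, grad, lr=0.1):
    # Illustrative local step: only the small prompt is trained;
    # the (hypothetical) frozen backbone is never transmitted.
    return prompt - lr * grad

def aggregate_prompts(client_prompts):
    # Illustrative server aggregation: FedAvg-style mean over prompts.
    # (PKT-FCL's actual aggregation differs; see the paper.)
    return np.mean(client_prompts, axis=0)

# Toy setup: 3 clients, each holding a prompt of 4 tokens x 8 dims.
rng = np.random.default_rng(0)
prompts = [np.zeros((4, 8)) for _ in range(3)]
grads = [rng.normal(size=(4, 8)) for _ in range(3)]

updated = [client_prompt_update(p, g) for p, g in zip(prompts, grads)]
global_prompt = aggregate_prompts(updated)

# Per-round communication is only the prompt parameters:
params_exchanged = global_prompt.size  # 32 floats, vs. millions for a full model
```

The communication saving comes from `params_exchanged` scaling with the prompt size alone, independent of the backbone's parameter count.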
