Capacity-Agnostic Parameter Isolation for Continual Graph Learning
Abstract
Existing parameter-isolation methods for continual learning employ diverse designs to accommodate more tasks within a limited model capacity. However, most of these designs incur substantial computational overhead when the model capacity is enlarged to accommodate new tasks in a continually growing task stream, creating a significant efficiency bottleneck. In this paper, we propose a novel graph neural network (GNN) framework with an architecture inspired by biological neurons, termed the capacity-agnostic GNN (CAGNN), to simultaneously overcome catastrophic forgetting and improve efficiency under capacity expansion. Unlike methods that propagate through the full network, CAGNN leverages graph contextual information to construct task-specific subnetworks and decouples these subnetworks during both training and inference, while still enabling effective knowledge transfer across tasks. Extensive experiments demonstrate that CAGNN outperforms state-of-the-art methods in both effectiveness and computational efficiency.
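To make the parameter-isolation idea concrete, the sketch below shows one common realization: per-task binary masks over a shared weight matrix, so that each task trains and infers through only its own subnetwork while earlier tasks' parameters are frozen. This is a minimal, hypothetical illustration, not the authors' CAGNN implementation; the names (MaskedLinear, add_task, the random mask-allocation heuristic standing in for the paper's graph-context-driven subnetwork construction) are assumptions introduced for exposition.

```python
# Minimal sketch of parameter isolation via per-task binary masks.
# Illustrative only; not the CAGNN code from the paper.
import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    """Shared weights; each task sees only its own binary-masked subnetwork."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_dim, in_dim) * 0.01)
        self.masks = {}  # task_id -> {0,1} mask with the same shape as weight
        self.used = torch.zeros(out_dim, in_dim)  # union of earlier tasks' masks

    def add_task(self, task_id, keep_ratio=0.3):
        # Allocate a subnetwork for the new task, preferring weights not yet
        # claimed by earlier tasks. (A random heuristic here; the paper uses
        # graph contextual information for this step.)
        free = (self.used == 0).float()
        scores = torch.rand_like(self.weight) * (free + 0.01)
        k = int(keep_ratio * self.weight.numel())
        thresh = scores.flatten().topk(k).values.min()
        mask = (scores >= thresh).float()
        self.masks[task_id] = mask
        self.used = torch.clamp(self.used + mask, max=1.0)

    def forward(self, x, task_id):
        # Inference touches only this task's subnetwork: parameters belonging
        # to other tasks are decoupled and never propagated.
        return x @ (self.weight * self.masks[task_id]).t()

def freeze_old_tasks_hook(layer, task_id, prev_used):
    # Gradient hook: updates flow only to weights inside the current task's
    # mask that are not owned by earlier tasks, so old subnetworks stay
    # intact and catastrophic forgetting is avoided.
    def hook(grad):
        return grad * layer.masks[task_id] * (1.0 - prev_used)
    return hook
```

Because inference for a given task multiplies the shared weights by a fixed sparse mask, the per-task cost stays roughly constant even as capacity grows, which is the efficiency property the abstract refers to as capacity-agnostic.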