Learnability-Driven Knowledge Assimilation for Class-Incremental Semantic Segmentation
Abstract
Class-incremental semantic segmentation learns new classes while retaining old ones without access to past data. Although existing methods alleviate catastrophic forgetting of old classes, performance on new classes remains limited. We identify the key bottleneck as low-margin regions, where the logit of the ground-truth class is close to that of the most competitive non-ground-truth class. Our theoretical analysis suggests that optimization in these regions is characterized by high curvature and a small stability radius, making learning prone to class confusion. Motivated by this analysis, we propose Learnability-Driven Knowledge Assimilation (LDKA), which targets low-margin learning through three complementary optimization strategies: (i) Progressive Margin Learning continuously reallocates the pixel-wise optimization budget in a threshold-free manner, shifting emphasis from high-margin to low-margin regions; (ii) Smooth Knowledge Distillation applies curvature damping and perturbation stabilization to suppress high-frequency updates and enlarge the stability radius; (iii) Misclassification-Aware Decoupling measures inter-class confusion with a competition matrix and decouples highly competitive class representations. Experiments show that LDKA improves mIoU on new classes while preserving performance on old classes across nine incremental protocols.
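To make the central quantity concrete, the following is a minimal sketch (not the authors' code) of the pixel-wise margin the abstract refers to: the ground-truth logit minus the strongest non-ground-truth logit at each pixel. It assumes a standard (B, C, H, W) logit layout; the function name, the toy threshold, and the usage example are illustrative assumptions.

```python
# Illustrative sketch: pixel-wise margin between the ground-truth logit and the
# most competitive non-ground-truth logit. Names and shapes are assumptions,
# not the paper's released implementation.
import torch

def pixelwise_margin(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """logits: (B, C, H, W); labels: (B, H, W) with class indices in [0, C)."""
    idx = labels.unsqueeze(1)                               # (B, 1, H, W)
    gt_logit = logits.gather(1, idx).squeeze(1)             # ground-truth logit, (B, H, W)
    masked = logits.scatter(1, idx, float("-inf"))          # hide the GT class
    competitor = masked.max(dim=1).values                   # strongest non-GT logit
    return gt_logit - competitor                            # low values = confusion-prone pixels

# Toy usage: random logits for a 4-class, 8x8 prediction map.
logits = torch.randn(2, 4, 8, 8)
labels = torch.randint(0, 4, (2, 8, 8))
margin = pixelwise_margin(logits, labels)
low_margin_frac = (margin < 0.5).float().mean()  # 0.5 is an arbitrary illustrative cutoff
print(f"low-margin pixel fraction: {low_margin_frac:.2f}")
```

Note that LDKA's Progressive Margin Learning is described as threshold-free; the fixed cutoff above is used only to visualize how low-margin regions could be identified.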