Poster in Workshop: ICML 2024 Workshop on Foundation Models in the Wild
BUILD: Buffer-free Incremental Learning with OOD Detection for the Wild
Srishti Gupta · Daniele Angioni · Lea Schönherr · Ambra Demontis · Battista Biggio
Keywords: [ Buffer-free ] [ Out-of-Distribution Detection ] [ Privacy ] [ Transformer ] [ Continual Learning ] [ Incremental Learning ]
Having a model that can dynamically learn new classes while detecting Out-of-Distribution (OOD) samples is a desirable property for most applications operating in the wild. Although work in this direction is limited, some approaches combine Incremental Learning (IL) with OOD detection and show promising results on both tasks. Most of these works rely on a buffer of stored samples, either to replay past data during learning or to detect outliers at test time, which raises several issues: 1) it does not scale with a growing number of samples, 2) it creates privacy concerns, as storing samples may not always be a compliant option, 3) it restricts outlier detection to the distribution captured in the buffer, and 4) it is computationally and memory expensive. In this work, we address these issues with a simple yet effective framework, BUILD, which performs both IL and OOD detection in a buffer-free manner and is designed to operate in the wild. BUILD integrates a pre-trained vision transformer that is fine-tuned with hard attention masks, along with post-hoc OOD detectors applied at test time. We show that BUILD, when combined with an activation-based post-hoc OOD technique, achieves not only competitive but superior performance compared to state-of-the-art baselines. To support our claims, we evaluate the proposed framework on the CIFAR-10 classification benchmark; the results show that BUILD detects OOD samples more accurately and more stably than the baselines, at a much lower computational cost.
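To make the buffer-free, post-hoc detection step concrete, the sketch below shows one common activation-based recipe: ReAct-style clipping of penultimate activations followed by an energy score over the logits. The abstract does not specify which detector, backbone, clipping value, or threshold BUILD uses, so the stand-in backbone, the `clip` value, and the thresholding here are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a buffer-free, activation-based post-hoc OOD score:
# ReAct-style activation clipping + energy score. No stored samples are
# needed at test time. The backbone stub, clip value, and threshold are
# assumptions for illustration, not the method's actual settings.
import torch
import torch.nn as nn


@torch.no_grad()
def energy_ood_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    # Energy score: higher values indicate the sample is more likely
    # in-distribution; lower values suggest an OOD sample.
    return temperature * torch.logsumexp(logits / temperature, dim=-1)


@torch.no_grad()
def react_logits(features: torch.Tensor, head: nn.Module, clip: float = 1.0) -> torch.Tensor:
    # ReAct: clip penultimate activations before the classifier head to
    # suppress the abnormally large activations OOD inputs tend to produce.
    return head(features.clamp(max=clip))


# Hypothetical frozen feature extractor standing in for a pre-trained ViT,
# plus a classifier head for the classes learned so far.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 768), nn.ReLU())
head = nn.Linear(768, 10)

x = torch.randn(4, 3, 32, 32)           # a batch of test images (CIFAR-10 sized)
feats = backbone(x)                      # penultimate activations
logits = react_logits(feats, head, clip=1.0)
scores = energy_ood_score(logits)        # threshold these to flag OOD samples
is_ood = scores < scores.mean()          # placeholder threshold; tune on held-out data
```

Because both the clipping and the energy score operate only on the model's own activations, this style of detector needs no replay buffer, which is the property the framework relies on.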