

Poster

Online Continual Learning from Imbalanced Data

Aristotelis Chrysakis · Marie-Francine Moens

Keywords: [ Representation Learning ] [ Supervised Learning ] [ Transfer and Multitask Learning ] [ Transfer, Multitask and Meta-learning ]


Abstract:

A well-documented weakness of neural networks is that they suffer from catastrophic forgetting when trained on data drawn from a non-stationary distribution. Recent work in the field of continual learning attempts to understand and overcome this issue. Unfortunately, the majority of relevant work implicitly assumes that the distribution of observed data is perfectly balanced, even though, in the real world, humans and animals learn from observations that are temporally correlated and severely imbalanced. Motivated by this observation, we evaluate the memory population methods used in online continual learning on highly imbalanced and temporally correlated streams of data. More importantly, we introduce a new memory population approach, which we call class-balancing reservoir sampling (CBRS). We demonstrate that CBRS outperforms state-of-the-art memory population algorithms in a considerably challenging learning setting, across a range of datasets and multiple architectures.
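The abstract names CBRS but does not describe how it works. The Python sketch below illustrates one way a class-balancing reservoir buffer could operate: fill the memory first, then protect under-represented classes by evicting samples from the currently largest class, and fall back to per-class reservoir sampling otherwise. The two-phase replacement rule and the `ClassBalancingReservoir` class are illustrative assumptions, not necessarily the authors' exact algorithm.

import random
from collections import defaultdict

class ClassBalancingReservoir:
    """Sketch of a class-balancing reservoir buffer (assumed design)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.memory = []               # stored (x, y) pairs
        self.seen = defaultdict(int)   # stream count per class
        self.stored = defaultdict(int) # memory count per class

    def _largest_classes(self):
        top = max(self.stored.values())
        return [c for c, n in self.stored.items() if n == top]

    def update(self, x, y):
        self.seen[y] += 1
        # Phase 1: memory not yet full -- store everything.
        if len(self.memory) < self.capacity:
            self.memory.append((x, y))
            self.stored[y] += 1
            return
        largest = self._largest_classes()
        if y not in largest:
            # Phase 2a: incoming class is under-represented; evict a
            # random sample of a currently largest class to make room.
            victim_class = random.choice(largest)
            candidates = [i for i, (_, c) in enumerate(self.memory)
                          if c == victim_class]
            i = random.choice(candidates)
            self.memory[i] = (x, y)
            self.stored[victim_class] -= 1
            self.stored[y] += 1
        else:
            # Phase 2b: per-class reservoir sampling; keep the incoming
            # sample with probability stored/seen for its own class.
            if random.random() < self.stored[y] / self.seen[y]:
                candidates = [i for i, (_, c) in enumerate(self.memory)
                              if c == y]
                i = random.choice(candidates)
                self.memory[i] = (x, y)

# Usage on a 9:1 imbalanced stream: the buffer ends up roughly balanced.
buf = ClassBalancingReservoir(capacity=100)
stream = [(i, 0) for i in range(900)] + [(i, 1) for i in range(100)]
for x, y in stream:
    buf.update(x, y)
# buf.stored now holds about 50 samples of each class, even though
# class 0 was nine times more frequent than class 1 in the stream.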
