Poster in Workshop: HiLD: High-dimensional Learning Dynamics Workshop

Elephant Neural Networks: Born to Be a Continual Learner

Qingfeng Lan · Rupam Mahmood


Abstract:

Catastrophic forgetting has remained a major challenge for continual learning for decades. While recent works have proposed effective methods to mitigate this problem, they mainly focus on the algorithmic side. Meanwhile, we do not fully understand which architectural properties of neural networks lead to catastrophic forgetting. This study aims to fill this gap by examining the role of activation functions in the training dynamics of neural networks and their impact on catastrophic forgetting. Our study reveals that, besides sparse representations, the gradient sparsity of activation functions also plays an important role in reducing forgetting. Based on this insight, we propose a new class of activation functions, elephant activation functions, that can generate both sparse representations and sparse gradients. We show that the resilience of neural networks to forgetting can be significantly improved by simply replacing classical activation functions with elephant activation functions.
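The abstract does not define the elephant activation itself. As a hedged illustration of an activation with both properties it names, the PyTorch sketch below implements a bump-shaped function f(x) = 1 / (1 + |x/a|^d), which has near-zero outputs and near-zero gradients away from its center; the functional form and the parameters a and d are assumptions for illustration, not necessarily the paper's exact definition.

```python
import torch
import torch.nn as nn


class BumpActivation(nn.Module):
    """Illustrative bump-shaped activation: f(x) = 1 / (1 + |x/a|^d).

    Outputs and gradients both decay rapidly away from x = 0, so the
    function produces sparse representations and sparse gradients.
    NOTE: this form is an assumption for illustration; the paper's
    elephant activation may be defined differently.
    """

    def __init__(self, a: float = 1.0, d: float = 4.0):
        super().__init__()
        self.a = a  # width of the bump
        self.d = d  # steepness; larger d gives a sharper cutoff

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return 1.0 / (1.0 + torch.abs(x / self.a) ** self.d)


# Quick check: both activations and gradients vanish far from the bump.
x = torch.linspace(-5.0, 5.0, 11, requires_grad=True)
y = BumpActivation()(x)
y.sum().backward()
print(y)       # near 0 except close to x = 0
print(x.grad)  # near 0 except around the bump's edges
```

Under this assumed form, inputs far from a unit's bump produce essentially no output and essentially no weight update, so learning a new task leaves weights serving old inputs largely untouched, which is the mechanism the abstract credits for reduced forgetting.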
