

Poster

Slow and Steady Wins the Race: Maintaining Plasticity with Hare and Tortoise Networks

Hojoon Lee · Hyeonseo Cho · Hyunseung Kim · Donghu Kim · Dugki Min · Jaegul Choo · Clare Lyle


Abstract:

This study investigates the loss of generalization ability in neural networks, revisiting the warm-starting experiments of Ash & Adams (2020). Our empirical analysis reveals that common methods designed to enhance plasticity by maintaining trainability yield only limited gains in generalization. While reinitializing the network is effective, it also risks losing valuable prior knowledge. To address this, we introduce Hare & Tortoise, inspired by the brain's complementary learning system. Hare & Tortoise consists of two components: the Hare network, which rapidly updates information like the hippocampus, and the Tortoise network, which gradually integrates knowledge akin to the neocortex. By periodically reinitializing the Hare network to the Tortoise network's weights, our method preserves plasticity while retaining generalizable knowledge. Hare & Tortoise effectively maintains the network's plasticity to generalize, improving advanced reinforcement learning algorithms on the Atari-100k benchmark.
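The abstract does not spell out how the Tortoise network "gradually integrates knowledge"; a common choice for such slow networks is an exponential moving average (EMA) of the fast network's weights, which is assumed in the minimal sketch below. The PyTorch-style code is illustrative only: `compute_loss`, `ema_tau`, and `reset_interval` are hypothetical names and hyperparameters, not values from the paper.

```python
import copy
import torch

def make_hare_and_tortoise(model: torch.nn.Module):
    """Create a fast (Hare) and slow (Tortoise) copy of the same architecture."""
    hare = model
    tortoise = copy.deepcopy(model)
    for p in tortoise.parameters():
        p.requires_grad_(False)  # the Tortoise is never updated by gradients
    return hare, tortoise

@torch.no_grad()
def update_tortoise(hare, tortoise, ema_tau=0.005):
    """Slowly integrate the Hare's knowledge into the Tortoise (assumed EMA update)."""
    for p_slow, p_fast in zip(tortoise.parameters(), hare.parameters()):
        p_slow.mul_(1.0 - ema_tau).add_(p_fast, alpha=ema_tau)

@torch.no_grad()
def reset_hare(hare, tortoise):
    """Periodically reinitialize the Hare to the Tortoise's weights to restore plasticity."""
    hare.load_state_dict(tortoise.state_dict())

def train(hare, tortoise, optimizer, data_iter, num_steps, reset_interval=10_000):
    """Illustrative training loop; the loss function is a task-specific placeholder."""
    for step in range(1, num_steps + 1):
        batch = next(data_iter)
        loss = compute_loss(hare, batch)   # hypothetical task-specific loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()                   # the Hare updates rapidly via gradient descent
        update_tortoise(hare, tortoise)    # the Tortoise follows slowly
        if step % reset_interval == 0:
            reset_hare(hare, tortoise)     # restore plasticity, keep generalizable knowledge
```

Under this reading, the Tortoise plays the role of the slowly consolidating neocortex, while the Hare, like the hippocampus, is free to adapt quickly and is periodically pulled back to the consolidated weights rather than to a random reinitialization.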
