

Poster in Workshop: 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)

Episodic Memory Theory of Recurrent Neural Networks: Insights into Long-Term Information Storage and Manipulation

Arjun Karuvally · Peter DelMastro · Hava Siegelmann


Abstract:

Recurrent neural networks (RNNs) have emerged as powerful models capable of storing and manipulating external information over long periods across various domains. Yet the mechanisms that underlie this behavior remain a mystery due to the black-box nature of these models. This paper addresses this gap by proposing an episodic memory theory of RNN dynamics, enabling a more comprehensive understanding of RNN weights as memories and inter-memory interactions. This approach sheds light on the inner workings of RNNs and connects to existing research on memory representation and organization. The theory extends current linearization approaches by providing alternative interpretations of the eigenspectrum and its connection to the long-term storage and manipulation of information. We discuss how the segregation, representation, and composition of the variable binding problem, a fundamental question in cognitive science and artificial intelligence, can be mechanistically interpreted within the theory. Using an elementary task, repeat copy, we demonstrate the validity of the theory in experimental settings. Our work represents a step toward opening the black box of RNNs, offering new insights into their functionality and bridging the gap between recurrent neural networks and memory models.
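The abstract refers to linearization approaches and the eigenspectrum of RNN dynamics. As a rough illustration of the kind of analysis involved, not the paper's specific method, the sketch below linearizes a vanilla RNN around a numerically found fixed point and inspects the eigenvalues of the Jacobian. The tanh update rule, the random weights, and the fixed-point search are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a standard RNN linearization analysis (assumptions, not the paper's code):
# a vanilla RNN h_{t+1} = tanh(W_hh h_t + b) run autonomously (zero input), linearized around
# a fixed point h*; eigenvalues of the Jacobian describe how stored information persists
# (|lambda| near 1), decays (|lambda| < 1), or rotates (complex lambda).

rng = np.random.default_rng(0)
n_hidden = 64

# Hypothetical weights; in practice these would come from an RNN trained on e.g. repeat copy.
W_hh = rng.normal(scale=1.0 / np.sqrt(n_hidden), size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

def step(h):
    """One autonomous (zero-input) RNN update."""
    return np.tanh(W_hh @ h + b)

# Approximate a fixed point h* = tanh(W_hh h* + b) by iterating the autonomous dynamics.
h = rng.normal(scale=0.1, size=n_hidden)
for _ in range(1000):
    h = step(h)
h_star = h

# Jacobian at h*: J = diag(1 - tanh^2(a)) @ W_hh, where a = W_hh h* + b.
a = W_hh @ h_star + b
J = np.diag(1.0 - np.tanh(a) ** 2) @ W_hh

# Eigenspectrum of the linearized dynamics around h*.
eigvals = np.linalg.eigvals(J)
order = np.argsort(-np.abs(eigvals))
print("leading |eigenvalues|:", np.round(np.abs(eigvals[order][:5]), 3))
```

In this linearized picture, modes with eigenvalue magnitude close to one are the directions along which information can be retained over long horizons, which is the regime the episodic memory interpretation of the eigenspectrum concerns.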
