
Poster in Workshop: Next Generation of Sequence Modeling Architectures

Reservoir Memory Networks: Long-range temporal dependencies with untrained RNNs

Claudio Gallicchio · Andrea Ceni


Abstract:

We introduce Reservoir Memory Networks (RMNs), a novel class of Reservoir Computing (RC) models designed to enhance long-term information retention in Recurrent Neural Networks (RNNs) without training the dynamic components. By integrating a linear memory cell with a non-linear reservoir, RMNs efficiently manage long-term dependencies, a traditional challenge in standard RC approaches. Our empirical analysis on time-series classification and pixel-level 1-D classification tasks demonstrates that RMNs not only outperform conventional reservoir models but also achieve competitive accuracy compared to fully trainable RNNs. These results highlight RMNs' potential as a computationally efficient alternative for handling complex sequential tasks.
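
The abstract describes the architecture only at a high level. Below is a minimal NumPy sketch of one plausible reading: an untrained linear memory cell feeding an untrained non-linear reservoir, with only a linear readout fitted. Every concrete choice here (the orthogonal cyclic-shift matrix as the memory cell, the random tanh reservoir, the ridge-regression readout, and all sizes and scalings) is an illustrative assumption, not the authors' specification.

```python
# Hypothetical sketch of a Reservoir Memory Network (RMN).
# Assumptions (not from the paper): the linear memory cell is an orthogonal
# cyclic-shift matrix, the non-linear reservoir is a random tanh network
# scaled to a target spectral radius, and the readout is ridge regression
# on the final state of each sequence. Only the readout is trained.
import numpy as np

rng = np.random.default_rng(0)


class ReservoirMemoryNetwork:
    def __init__(self, n_in, n_mem=50, n_res=100, rho=0.9, in_scale=0.1):
        # Linear memory cell: orthogonal cyclic shift, so the linear
        # dynamics are lossless and retain past inputs. Untrained.
        self.W_mem = np.roll(np.eye(n_mem), 1, axis=0)
        self.V_mem = in_scale * rng.uniform(-1, 1, (n_mem, n_in))
        # Non-linear reservoir: random recurrent weights rescaled to
        # spectral radius rho, driven by the input and the memory state.
        W = rng.uniform(-1, 1, (n_res, n_res))
        self.W_res = rho * W / np.max(np.abs(np.linalg.eigvals(W)))
        self.V_res = in_scale * rng.uniform(-1, 1, (n_res, n_in))
        self.U_res = in_scale * rng.uniform(-1, 1, (n_res, n_mem))
        self.n_mem, self.n_res = n_mem, n_res
        self.W_out = None  # the only trained component

    def final_state(self, u):
        # u: (T, n_in) input sequence; returns the concatenated
        # [memory; reservoir] state after the last time step.
        m = np.zeros(self.n_mem)
        x = np.zeros(self.n_res)
        for u_t in u:
            m = self.W_mem @ m + self.V_mem @ u_t  # linear memory update
            x = np.tanh(self.W_res @ x + self.V_res @ u_t + self.U_res @ m)
        return np.concatenate([m, x])

    def fit(self, seqs, labels, reg=1e-6):
        # Ridge-regression readout on one-hot class targets.
        S = np.stack([self.final_state(u) for u in seqs])
        Y = np.eye(int(labels.max()) + 1)[labels]
        self.W_out = np.linalg.solve(
            S.T @ S + reg * np.eye(S.shape[1]), S.T @ Y
        )

    def predict(self, seqs):
        S = np.stack([self.final_state(u) for u in seqs])
        return np.argmax(S @ self.W_out, axis=1)
```

Choosing an orthogonal matrix for the memory cell gives norm-preserving linear dynamics, which is one standard way an untrained model can carry information over long horizons; the non-linear reservoir then provides the rich features for the readout. How the actual RMN couples the two components may differ from this sketch.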
