
On the Universality of Linear Recurrences Followed by Nonlinear Projections
Antonio Orvieto · Soham De · Razvan Pascanu · Caglar Gulcehre · Samuel Smith

We show that a family of sequence models based on recurrent linear layers (including S4, S5, and the LRU) interleaved with position-wise multi-layer perceptrons (MLPs) can approximate any sufficiently regular nonlinear sequence-to-sequence map arbitrarily well. The main idea behind our result is to view recurrent layers as compression algorithms that can faithfully store information about the input sequence in an internal state before it is processed by the highly expressive MLP.
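To make the architecture concrete, here is a minimal NumPy sketch of the building block the abstract describes: a linear recurrence that compresses the input sequence into a state, followed by a position-wise MLP applied independently at each time step. This is an illustration only, not the authors' implementation; the function names and the real diagonal recurrence (a simplification of the diagonal complex recurrence used in models like the LRU) are assumptions for the example.

import numpy as np

def linear_recurrence(x, A, B):
    # Diagonal linear recurrence h_t = A * h_{t-1} + B x_t over a sequence.
    # x: (T, d_in) inputs; A: (d_h,) diagonal state transition (|A_j| < 1
    # for stability); B: (d_h, d_in) input projection. Returns (T, d_h).
    h = np.zeros(A.shape[0])
    states = np.empty((x.shape[0], A.shape[0]))
    for t in range(x.shape[0]):
        h = A * h + B @ x[t]  # purely linear update: no nonlinearity here
        states[t] = h
    return states

def position_wise_mlp(h, W1, b1, W2, b2):
    # The same two-layer ReLU MLP is applied at every time step; all the
    # nonlinearity of the model lives in this projection.
    return np.maximum(h @ W1 + b1, 0.0) @ W2 + b2

# Toy usage with random weights (hypothetical sizes).
rng = np.random.default_rng(0)
T, d_in, d_h, d_mlp, d_out = 16, 3, 8, 32, 2
x = rng.normal(size=(T, d_in))
A = rng.uniform(0.5, 0.99, size=d_h)              # stable diagonal recurrence
B = rng.normal(size=(d_h, d_in)) / np.sqrt(d_in)
W1 = rng.normal(size=(d_h, d_mlp)); b1 = np.zeros(d_mlp)
W2 = rng.normal(size=(d_mlp, d_out)); b2 = np.zeros(d_out)

y = position_wise_mlp(linear_recurrence(x, A, B), W1, b1, W2, b2)
print(y.shape)  # (16, 2): one output per input time step

The recurrence acts as the "compression" stage, storing a summary of the input history in the state h, while the MLP reads that state out nonlinearly at each position.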

Author Information

Antonio Orvieto (ETH Zurich)
Soham De (Google DeepMind)
Razvan Pascanu (DeepMind)
Caglar Gulcehre (DeepMind)
Samuel Smith (DeepMind)
