

Poster in Workshop: Next Generation of Sequence Modeling Architectures

HiPPO-Prophecy: State-Space Models can Provably Learn Dynamical Systems in Context

Federico Arangath Joseph · Noah Liniger · Kilian Haefeli


Abstract:

This work explores the in-context learning capabilities of State Space Models (SSMs) and, to the best of our knowledge, provides the first theoretical explanation of a potential mechanism underlying them. We introduce a novel weight construction for Linear SSMs, allowing them to predict the next state of any dynamical system after observing previous states, without requiring parameter fine-tuning. To do so, we study Linear SSMs under the HiPPO framework and extend it to show that they are capable of approximating the derivative of the input signal. We then derive an asymptotic error bound on the approximation of this derivative. Discretizing the SSM results in a weight construction that predicts the next state of the dynamical system. Following this, we demonstrate the effectiveness of our parametrization empirically. This work should serve as an initial step towards a better understanding of how neural sequence models based on state space models learn in-context.
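To make the mechanism described above concrete, the following is a minimal numerical sketch, not the paper's actual weight construction: it uses the standard HiPPO-LegS matrices from Gu et al. (2020) to compress an observed trajectory into Legendre coefficients, differentiates the resulting polynomial approximation to estimate the signal's derivative, and takes one Euler step to predict the next state. The test signal, state size, and step sizes are illustrative choices.

import numpy as np
from numpy.polynomial.legendre import legval, legder

N = 16                      # state size = number of Legendre coefficients
n = np.arange(N)

# HiPPO-LegS matrices (Gu et al., 2020):
#   A[n, k] = sqrt((2n+1)(2k+1)) for n > k,  n + 1 for n = k,  0 for n < k
#   B[n]    = sqrt(2n+1)
A = np.zeros((N, N))
for i in range(N):
    A[i, :i] = np.sqrt((2 * i + 1) * (2 * n[:i] + 1))
    A[i, i] = i + 1
B = np.sqrt(2.0 * n + 1)

# Observed trajectory of a dynamical system (illustrative: damped oscillator).
dt = 1e-3
T = 4000
t_grid = dt * np.arange(1, T + 1)
f = np.exp(-0.2 * t_grid) * np.sin(3.0 * t_grid)

# Integrate the scale-invariant LegS ODE  c'(t) = -A c(t)/t + B f(t)/t  with
# forward Euler; since dt / t_k = 1/k, this is the discrete LegS recurrence
#   c_k = (I - A/k) c_{k-1} + (1/k) B f_k.
c = np.zeros(N)
for k, fk in enumerate(f, start=1):
    c = c + (-A @ c + B * fk) / k

# The state encodes the history as  f(s) ~ sum_n c_n sqrt(2n+1) P_n(2s/t - 1).
coeffs = c * np.sqrt(2.0 * n + 1)
t_end = t_grid[-1]

f_hat = legval(1.0, coeffs)                           # reconstruct f(t_end)
df_hat = (2.0 / t_end) * legval(1.0, legder(coeffs))  # chain rule: d/ds = (2/t) d/du

# One Euler step with the estimated derivative = a next-state prediction.
f_next_pred = f_hat + dt * df_hat
f_next_true = np.exp(-0.2 * (t_end + dt)) * np.sin(3.0 * (t_end + dt))
print(f"f(t):    true {f[-1]:+.5f}   reconstructed {f_hat:+.5f}")
print(f"f(t+dt): true {f_next_true:+.5f}   predicted     {f_next_pred:+.5f}")

Note that the paper derives an asymptotic bound on the derivative-approximation error; this sketch only checks the idea numerically on one smooth signal, where the derivative estimate at the right edge of the interval is expected to be the rough step.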
