

Poster in Workshop: Next Generation of Sequence Modeling Architectures

Q-S5: Towards Quantized State Space Models

Steven Abreu · Jens Egholm Pedersen · Kade Heckel · Alessandro Pierro


Abstract:

In the quest for next-generation sequence modeling architectures, State Space Models (SSMs) have emerged as a potent alternative to transformer architectures, particularly noted for their computational efficiency and suitability for dynamical systems. This paper investigates the effect of quantization on the S5 model to better understand its impact on model performance and to facilitate its deployment to edge and resource-constrained platforms. Using quantization-aware training (QAT) and post-training quantization (PTQ), we systematically evaluate the quantization sensitivity of SSMs across different tasks, including dynamical systems modeling, Sequential MNIST (SMNIST), and most of the Long Range Arena (LRA). We present fully quantized S5 models whose test accuracy drops by less than 1% on SMNIST and most of the LRA. We find that performance on most tasks degrades significantly for recurrent weights below 8-bit precision, but that other components can be compressed further without significant loss of performance. Our results further show that PTQ performs well only on language-based LRA tasks, whereas all others require QAT. Our investigation provides necessary insights for the continued development of efficient and hardware-optimized SSMs.
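The abstract reports that recurrent weights are the most quantization-sensitive component under QAT. As an illustration only, and not the authors' implementation, the JAX sketch below shows one common way such experiments are set up: symmetric uniform fake quantization with a straight-through estimator, applied to a toy real-valued diagonal linear SSM. All function names, bit-width choices, and the toy model itself are assumptions; the actual S5 model uses a complex-valued diagonal recurrence and a parallel scan.

```python
# Minimal QAT-style sketch (assumed, not the paper's code): fake quantization
# with a straight-through estimator on a toy diagonal linear SSM.
import jax
import jax.numpy as jnp


def fake_quant(x, num_bits=8):
    """Symmetric uniform quantize-dequantize with a straight-through estimator."""
    qmax = 2.0 ** (num_bits - 1) - 1.0
    scale = jnp.max(jnp.abs(x)) / qmax + 1e-8
    q = jnp.round(x / scale) * scale              # quantize, then dequantize
    return x + jax.lax.stop_gradient(q - x)       # STE: gradient passes through


def quantized_ssm_scan(A_diag, B, C, u, w_bits=8, a_bits=8):
    """Run x_t = A x_{t-1} + B u_t, y_t = C x_t with fake-quantized parameters."""
    A_q = fake_quant(A_diag, a_bits)   # recurrent weights: reportedly most sensitive
    B_q = fake_quant(B, w_bits)
    C_q = fake_quant(C, w_bits)

    def step(x, u_t):
        x = A_q * x + B_q @ u_t
        return x, C_q @ x

    x0 = jnp.zeros_like(A_diag)
    _, ys = jax.lax.scan(step, x0, u)
    return ys


# Toy example: 4-state SSM on a random input sequence of length 100.
key = jax.random.PRNGKey(0)
kB, kC, ku = jax.random.split(key, 3)
A = jnp.linspace(0.5, 0.95, 4)                 # stable real diagonal recurrence
B = jax.random.normal(kB, (4, 2)) * 0.1
C = jax.random.normal(kC, (1, 4)) * 0.1
u = jax.random.normal(ku, (100, 2))            # (time, input_dim)
y = quantized_ssm_scan(A, B, C, u, w_bits=4, a_bits=8)
print(y.shape)  # (100, 1)
```

In a QAT setting, gradients flow through `fake_quant` unchanged because of the straight-through estimator, so the bit widths (here the hypothetical `a_bits` and `w_bits`) can be varied per component to probe sensitivity; in a PTQ setting, the same quantize-dequantize step would instead be applied once to an already-trained model.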
