Ramba: Selective State-Space Models for Relational Deep Learning
Abstract
Relational Deep Learning aims to learn directly on multi-table databases, yet current methods face a fundamental tension: Transformers' quadratic complexity prohibits the large contexts relational data demands, while GNNs sacrifice global context for efficiency. We introduce Ramba, the first selective state-space model for relational databases. Our approach features two innovations: (1) Topology-Aware Linearization, which processes cells via global columnar serialization in O(L) time while recovering relational structure through sparse entity and foreign-key attention masks; and (2) Schema Dynamic Gating, which modulates SSM state transitions based on the semantic alignment between the currently scanned attribute and the prediction target, enabling cross-table relevance filtering without relying on value distributions. Together, these innovations enable Ramba to ingest vast relational contexts while selectively retaining semantically relevant information, a combination existing architectures cannot achieve. Experiments demonstrate state-of-the-art performance with linear scalability across diverse relational benchmarks.
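To make the first innovation concrete, the following is a minimal sketch of how entity and foreign-key attention masks over a columnar serialization could be constructed. All names here (`relational_attention_mask`, `entity_ids`, `fk_edges`) are our illustrative assumptions, not notation from the paper, and the mask is materialized densely for clarity; an actual sparse implementation would store only the permitted index pairs.

```python
import numpy as np

def relational_attention_mask(entity_ids, fk_edges):
    """Illustrative sketch: recover relational structure over a serialization.

    entity_ids: length-L int array; entity_ids[i] is the row (entity) that
                cell i belongs to after global columnar serialization.
    fk_edges:   iterable of (child_entity, parent_entity) foreign-key pairs.
    Returns a boolean (L, L) mask; True means the cell pair may interact.
    """
    entity_ids = np.asarray(entity_ids)
    L = entity_ids.shape[0]
    # Entity mask: cells belonging to the same row may attend to each other.
    mask = entity_ids[:, None] == entity_ids[None, :]
    # Foreign-key mask: cells of rows linked by a foreign key may attend
    # across tables (symmetric, so both directions are permitted).
    linked = set()
    for child, parent in fk_edges:
        linked.add((child, parent))
        linked.add((parent, child))
    for i in range(L):  # dense loop for clarity; a real kernel stays sparse
        for j in range(L):
            if (entity_ids[i], entity_ids[j]) in linked:
                mask[i, j] = True
    return mask

# Example: 3 cells of row 0, 2 cells of row 1, with a 0 -> 1 foreign key.
print(relational_attention_mask([0, 0, 0, 1, 1], [(0, 1)]))
```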
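For the second innovation, the abstract specifies only that the gate depends on the semantic alignment between the scanned attribute and the prediction target. Below is a minimal NumPy sketch of one way such a gate could modulate a diagonal selective-SSM recurrence; the function name, the embeddings `attr_emb`/`target_emb`, and the choice of a sigmoid gate scaling a Mamba-style step size are our assumptions, not the paper's actual parameterization.

```python
import numpy as np

def schema_gated_scan(x, attr_emb, target_emb, A, B, C):
    """Sketch of Schema Dynamic Gating on a diagonal selective SSM.

    x:          (L, d_in)   cell embeddings in columnar scan order
    attr_emb:   (L, d_sem)  embedding of the schema attribute of each cell
    target_emb: (d_sem,)    embedding of the prediction target's attribute
    A:          (d_state,)  diagonal transition parameters (negative = stable)
    B:          (d_in, d_state),  C: (d_state, d_out)
    """
    h = np.zeros(A.shape[0])
    ys = []
    for t in range(x.shape[0]):
        # Gate from schema-level alignment only, independent of cell values.
        align = attr_emb[t] @ target_emb
        gate = 1.0 / (1.0 + np.exp(-align))   # sigmoid in (0, 1)
        dt = gate                             # aligned attributes take big steps;
                                              # unrelated ones are largely skipped
        A_bar = np.exp(dt * A)                # ZOH-style discretization (diagonal A)
        h = A_bar * h + dt * (x[t] @ B)       # gated state update (Euler for B)
        ys.append(h @ C)
    return np.stack(ys)

# Toy usage with random data.
rng = np.random.default_rng(0)
L, d_in, d_sem, d_state, d_out = 6, 4, 3, 8, 2
y = schema_gated_scan(
    rng.standard_normal((L, d_in)),
    rng.standard_normal((L, d_sem)),
    rng.standard_normal(d_sem),
    -np.abs(rng.standard_normal(d_state)),
    rng.standard_normal((d_in, d_state)),
    rng.standard_normal((d_state, d_out)),
)
print(y.shape)  # (L, d_out)
```

The sequential loop shown here is for readability; as with other selective SSMs, the recurrence admits a parallel scan, which is what preserves the O(L) scaling the abstract claims.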