Recursive Binding on a Budget: Subspace Carving in Order-$p$ Tensor Memories
Travis Pence ⋅ Daisuke Yamada ⋅ Vikas Singh
Abstract
Tensor Product Representations (TPRs) provide the structural fidelity required for symbolic reasoning in models but suffer from *exponential* dimensionality growth when encoding deep recursive structures. Conversely, Vector Symbolic Architectures maintain *constant* dimensionality but sacrifice capacity and fidelity due to noisy compression via superposition. In this work, we propose **Orthogonal Subspace Carving (OSC)**, a memory architecture that binds *fillers* to *roles* by projecting onto the null space of the role basis before aggregating into a fixed order-$p$ tensor. OSC uses these projections to enforce geometric orthogonality between bound structures within a *static* memory trace. We show that this mechanism decouples the tensor order from the structural depth, enabling deep recursive binding within a *constant* memory footprint. This construction allows for component vectors that are *orders of magnitude* smaller than the memory tensor, yielding excellent memory efficiency in settings involving high superposition. We also show that TPR binding is a special case of binding in a Clifford algebra, and give a Clifford formulation of OSC.
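The core mechanism can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: the dimension, the order $p=2$, and the use of a pseudoinverse to build the null-space projector are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64  # component vector dimension (illustrative choice)

# Role basis: columns span the subspace occupied by prior bindings.
R = rng.standard_normal((d, 4))

# Orthogonal projector onto the null space (orthogonal complement)
# of the role basis: P = I - R R^+.
P_null = np.eye(d) - R @ np.linalg.pinv(R)

filler = rng.standard_normal(d)
role = rng.standard_normal(d)

# "Carve": project the filler away from the occupied role subspace,
# then bind via an outer product into a fixed order-2 memory tensor.
carved = P_null @ filler
memory = np.outer(carved, role)

# The carved filler is orthogonal to every occupied role direction.
print(np.allclose(R.T @ carved, 0.0))  # → True
```

Note that the memory tensor stays order-2 regardless of how many bindings are superposed into it; orthogonality between bound structures comes from the projection, not from raising the tensor order.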