Hyperbolic Associative Memory Networks
Abstract
Modern Hopfield Networks (MHNs) have achieved widespread success across domains, but they operate in Euclidean/Hilbert spaces, whose geometric constraints prevent them from preserving the hierarchical structure of data: arbitrary tree structures cannot be embedded in Euclidean space with low distortion, whereas hyperbolic spaces accommodate hierarchies naturally through their exponential volume growth. To address this limitation, we propose Hyperbolic Associative Memory Networks (HAMNs), the first framework to embed modern associative memory in hyperbolic space. We map query and memory vectors from Euclidean space onto a manifold of constant negative curvature via exponential maps, define a regularized energy function based on the Minkowski inner product, and perform curvature-aware Riemannian optimization combined with exponential-map updates to achieve stable on-manifold retrieval. We further put forward a hierarchy-sensitivity hypothesis: HAMNs outperform Euclidean MHNs on data with deep hierarchies but perform comparably on data with weak or shallow hierarchies. We validate this hypothesis through depth-controlled experiments and cross-level metrics. As a plug-and-play, model-agnostic module, HAMNs support the storage and retrieval of representations in task architectures that require hierarchical understanding; we instantiate them on the Poincaré ball in our experiments, but the framework applies to any hyperbolic model of constant negative curvature.
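To make the geometric pipeline in the abstract concrete, the following is a minimal NumPy sketch of hyperbolic retrieval on the Poincaré ball. It is an illustrative simplification under our own assumptions, not the paper's exact formulation: the energy descent is replaced by one softmax-attention step over negative geodesic distances, and the update is a tangent-space (log/exp at the origin) convex combination. All function names (`exp_map_origin`, `poincare_dist`, `retrieve`) and the inverse-temperature parameter `beta` are hypothetical.

```python
import numpy as np

def exp_map_origin(v, c=1.0):
    # Exponential map at the origin of the Poincare ball of curvature -c:
    # exp_0(v) = tanh(sqrt(c)*||v||) * v / (sqrt(c)*||v||)
    norm = np.clip(np.linalg.norm(v, axis=-1, keepdims=True), 1e-9, None)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def poincare_dist(x, y, c=1.0):
    # Geodesic distance between points of the Poincare ball of curvature -c.
    diff = np.sum((x - y) ** 2, axis=-1)
    denom = (1 - c * np.sum(x * x, axis=-1)) * (1 - c * np.sum(y * y, axis=-1))
    arg = 1 + 2 * c * diff / denom
    return np.arccosh(np.clip(arg, 1.0, None)) / np.sqrt(c)

def retrieve(query, memories, beta=4.0, c=1.0):
    # One retrieval step (illustrative): map Euclidean query/memories onto the
    # ball, attend with softmax over negative hyperbolic distances, then form a
    # convex combination in the tangent space at the origin and map it back.
    q = exp_map_origin(query, c)
    M = exp_map_origin(memories, c)
    logits = -beta * poincare_dist(q[None, :], M, c)
    w = np.exp(logits - logits.max())
    w /= w.sum()
    # log_0(x) = artanh(sqrt(c)*||x||) * x / (sqrt(c)*||x||)  (inverse of exp_0)
    norms = np.clip(np.linalg.norm(M, axis=-1, keepdims=True), 1e-9, None)
    tangent = np.arctanh(np.clip(np.sqrt(c) * norms, None, 1 - 1e-7)) * M / (np.sqrt(c) * norms)
    return exp_map_origin((w[:, None] * tangent).sum(axis=0), c)

# Usage: a noisy query should retrieve (a point near) the stored pattern.
rng = np.random.default_rng(0)
memories = rng.normal(size=(5, 8))
query = memories[2] + 0.05 * rng.normal(size=8)
out = retrieve(query, memories)
# With low noise and moderate beta, out lies close to exp_0(memories[2]).
```

The softmax-over-distances step mirrors the update rule of Euclidean MHNs, with the inner-product similarity swapped for negative geodesic distance so that retrieval stays consistent with the manifold geometry.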