

Poster
in
Workshop: Geometry-grounded Representation Learning and Generative Modeling

Multivector Neurons: Better and Faster O(n)-Equivariant Clifford GNNs

Cong Liu · David Ruhe · Patrick Forré

Keywords: [ Equivariance ] [ Geometric algebra ] [ Graph Neural Networks ] [ Clifford Algebra ] [ Geometric Deep Learning ]


Abstract: Recent works have focused on designing deep learning models that are equivariant to the $O(n)$ or $SO(n)$ groups. These models either consider only scalar information, such as distances and angles, or incur very high computational complexity. In this work, we test several novel message-passing graph neural networks (GNNs) based on Clifford multivectors, structured similarly to prevalent equivariant models in geometric deep learning. Our approach leverages efficient invariant scalar features while simultaneously performing expressive learning on multivector representations, particularly through the equivariant geometric product operator. By integrating these elements, our methods outperform established efficient baselines on an N-body simulation task and a protein denoising task while maintaining high efficiency. In particular, we push the state-of-the-art error on the N-body dataset to 0.0035 (averaged over 3 runs), an 8\% improvement over recent methods.
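To make the geometric product operator mentioned in the abstract concrete, here is a minimal sketch of the Clifford geometric product in the algebra $Cl(2,0)$, with basis $\{1, e_1, e_2, e_{12}\}$. This is an illustrative toy, not the authors' implementation: the multiplication-table representation, the function name `geometric_product`, and the 2D restriction are all assumptions for exposition; the paper's models operate on learned multivector features in $O(n)$-equivariant layers.

```python
import numpy as np

# Basis of Cl(2,0): [1, e1, e2, e12]. A multivector is a length-4
# coefficient array. _TABLE[(i, j)] = (sign, k) means basis_i * basis_j
# = sign * basis_k, derived from e1*e1 = e2*e2 = 1 and e1*e2 = -e2*e1.
_TABLE = {
    (0, 0): (1, 0), (0, 1): (1, 1), (0, 2): (1, 2), (0, 3): (1, 3),
    (1, 0): (1, 1), (1, 1): (1, 0), (1, 2): (1, 3), (1, 3): (1, 2),
    (2, 0): (1, 2), (2, 1): (-1, 3), (2, 2): (1, 0), (2, 3): (-1, 1),
    (3, 0): (1, 3), (3, 1): (-1, 2), (3, 2): (1, 1), (3, 3): (-1, 0),
}

def geometric_product(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Bilinear geometric product of two Cl(2,0) multivectors."""
    out = np.zeros(4)
    for i in range(4):
        for j in range(4):
            sign, k = _TABLE[(i, j)]
            out[k] += sign * a[i] * b[j]
    return out

# For two pure vectors u, v the product uv = u.v + u ^ v: the scalar
# component is the O(n)-invariant dot product, the bivector component
# the equivariant oriented area.
u = np.array([0.0, 1.0, 2.0, 0.0])  # u = e1 + 2*e2
v = np.array([0.0, 3.0, 4.0, 0.0])  # v = 3*e1 + 4*e2
print(geometric_product(u, v))      # scalar 11 = u.v, bivector -2 = u ^ v
```

This split is what the abstract alludes to: the scalar (invariant) part can be consumed cheaply, like distances and angles in standard models, while the higher-grade parts carry equivariant geometric information that the multivector layers keep learning on.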
