

Poster in Workshop: Geometry-grounded Representation Learning and Generative Modeling

Geometric algebra transformers for large 3D meshes via cross-attention

Julian Suk · Pim de Haan · Baris Imre · Jelmer Wolterink

Keywords: [ group equivariance ] [ geometric algebra ] [ biomedical engineering ] [ transformers ]


Abstract:

Surface and volume meshes of 3D anatomical structures are widely used in biomedical engineering and medicine. The advent of machine learning has enabled viable applications, which come with the unique challenge of applying deep neural networks to large 3D meshes. In this work, we scale the recently introduced Geometric Algebra Transformer (GATr) to meshes with hundreds of thousands of vertices by projecting onto a coarser set of vertices via cross-attention. The resulting neural network inherits GATr's equivariance under rotation, translation, and reflection, which are desirable properties when dealing with 3D objects.
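To make the cross-attention idea concrete, the sketch below shows how a large set of mesh-vertex tokens can be projected onto a much smaller set of query tokens, so that attention cost grows with the product of coarse and fine set sizes rather than quadratically in the number of vertices. This is a minimal illustration in plain PyTorch, not the authors' equivariant GATr implementation (which operates on geometric-algebra multivectors); the module name `CrossAttentionPool`, the use of learned query tokens, and all dimensions are hypothetical choices for illustration.

```python
# Minimal sketch (plain PyTorch, NOT the authors' equivariant GATr code) of
# projecting a large set of mesh-vertex tokens onto a coarser set of query
# tokens via cross-attention. Names and sizes are illustrative assumptions.
import torch
import torch.nn as nn


class CrossAttentionPool(nn.Module):
    """Cross-attend from a coarse query set to the full vertex set."""

    def __init__(self, dim: int, num_queries: int, num_heads: int = 4):
        super().__init__()
        # Learned coarse tokens; one could instead use a subsampled set of
        # mesh vertices as queries.
        self.queries = nn.Parameter(torch.randn(num_queries, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, vertex_feats: torch.Tensor) -> torch.Tensor:
        # vertex_feats: (batch, num_vertices, dim); num_vertices may be large.
        batch = vertex_feats.shape[0]
        q = self.queries.unsqueeze(0).expand(batch, -1, -1)
        # Attention cost is O(num_queries * num_vertices) instead of
        # O(num_vertices^2) for full self-attention over the mesh.
        pooled, _ = self.attn(query=q, key=vertex_feats, value=vertex_feats)
        return pooled  # (batch, num_queries, dim)


if __name__ == "__main__":
    pool = CrossAttentionPool(dim=16, num_queries=128)
    x = torch.randn(2, 20_000, 16)  # toy stand-in for a large surface mesh
    print(pool(x).shape)  # torch.Size([2, 128, 16])
```

In the paper's setting, the coarse tokens would then be processed with the equivariant GATr blocks and, if per-vertex outputs are needed, mapped back to the full vertex set with a second cross-attention in the opposite direction.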
