

Contributed talk in Workshop: Geometry-grounded Representation Learning and Generative Modeling

Contributed: Bundle Neural Networks for message diffusion on graphs

Jacob Bamberger

[ Project Page ]
Sat 27 Jul 1:40 a.m. PDT — 1:45 a.m. PDT

Abstract:

The dominant paradigm for learning on graph-structured data is message passing. Despite being a strong inductive bias, the local message-passing mechanism suffers from pathological issues such as over-smoothing, over-squashing, and limited node-level expressivity. To address these limitations, we propose Bundle Neural Networks (BuNN), a new type of GNN that operates via message diffusion over flat vector bundles: structures analogous to connections on Riemannian manifolds that augment the graph by assigning to each node a vector space and an orthogonal map. We show that BuNNs can mitigate over-smoothing and over-squashing, and that they are universal compact uniform approximators on graphs. We showcase the strong empirical performance of BuNNs on real-world tasks, achieving state-of-the-art results on several standard benchmarks.
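To make the message-diffusion idea concrete, below is a minimal, hypothetical NumPy sketch of one bundle-style diffusion step. It is not the paper's implementation: the function names (`bundle_diffusion_step`, `random_orthogonal`), the use of randomly sampled rather than learned orthogonal maps, the symmetric degree normalisation, and the single diffusion step are all assumptions made for illustration. It only captures the mechanism stated in the abstract: each node carries an orthogonal map, features are transported into a common frame, diffused over the graph, and transported back.

```python
import numpy as np

def random_orthogonal(d, rng):
    # QR of a Gaussian matrix yields a random orthogonal map.
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q

def bundle_diffusion_step(X, A, O):
    """One hypothetical bundle message-diffusion step.

    X: (n, d) node features; A: (n, n) adjacency matrix;
    O: (n, d, d) per-node orthogonal maps (assumed, not learned here).
    Features are transported into a shared frame with O_i^T, diffused
    with the symmetrically normalised adjacency, and transported back.
    """
    deg = A.sum(axis=1)
    A_hat = A / np.sqrt(np.outer(deg, deg))    # D^{-1/2} A D^{-1/2}
    X_frame = np.einsum("nji,nj->ni", O, X)    # O_i^T x_i for each node i
    X_diff = A_hat @ X_frame                   # diffuse in the shared frame
    return np.einsum("nij,nj->ni", O, X_diff)  # transport back with O_i

rng = np.random.default_rng(0)
n, d = 5, 3
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((n, d))
O = np.stack([random_orthogonal(d, rng) for _ in range(n)])
print(bundle_diffusion_step(X, A, O))
```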
