

Poster in Workshop: Geometry-grounded Representation Learning and Generative Modeling

Sheaf Diffusion Goes Nonlinear: Enhancing GNNs with Adaptive Sheaf Laplacians

Olga Zaghen · Antonio Longa · Steve Azzolin · Lev Telyatnikov · Andrea Passerini · Pietro Liò

Keywords: [ cellular sheaf ] [ Sheaf Neural Networks ] [ Topological Deep Learning ] [ Graph Neural Networks ] [ Graph machine learning ]


Abstract:

Sheaf Neural Networks (SNNs) have recently been introduced to enhance the capability of Graph Neural Networks (GNNs) to learn from graphs. Previous studies focus on either linear sheaf Laplacians or hand-crafted nonlinear sheaf Laplacians. The former are not always expressive enough to model complex interactions between nodes, such as antagonistic dynamics and bounded-confidence dynamics, while the latter use a fixed nonlinear function that is not adapted to the data at hand. To enhance the capability of SNNs to capture complex node-to-node interactions while adapting to different scenarios, we propose a Nonlinear Sheaf Diffusion (NLSD) model, which incorporates nonlinearity into the Laplacian of SNNs through a general function learned from data. Our model is validated on a synthetic community detection dataset, where it outperforms linear SNNs and common GNN baselines on a node classification task, showcasing its ability to leverage complex network dynamics.
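The abstract does not give implementation details, but the idea of inserting a learned nonlinearity into the sheaf Laplacian can be illustrated concretely. Below is a minimal sketch of one diffusion step under common cellular-sheaf conventions: each edge carries restriction maps mapping node stalks into the edge stalk, the linear Laplacian aggregates edge-wise discrepancies, and the nonlinear variant passes each discrepancy through a function `phi` before aggregation. The function name, the explicit-Euler step, and the use of `tanh` as a stand-in for the learned `phi` are all illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def nonlinear_sheaf_diffusion_step(x, edges, F, phi, alpha=0.1):
    """One explicit-Euler step of (nonlinear) sheaf diffusion.

    x:     (n, d) array; one d-dimensional stalk vector per node.
    edges: list of (u, v) node-index pairs.
    F:     dict mapping (node, edge_index) -> (d, d) restriction map.
    phi:   nonlinearity applied to each edge-wise discrepancy
           (a learned function in NLSD; tanh here for illustration).
    alpha: diffusion step size.
    """
    Lx = np.zeros_like(x)
    for e, (u, v) in enumerate(edges):
        Fu, Fv = F[(u, e)], F[(v, e)]
        # Discrepancy between the two nodes, compared in the edge stalk.
        diff = Fu @ x[u] - Fv @ x[v]
        # Nonlinear response; phi = identity recovers the linear Laplacian.
        g = phi(diff)
        # Pull the response back to each node stalk with the transposes.
        Lx[u] += Fu.T @ g
        Lx[v] -= Fv.T @ g
    return x - alpha * Lx

# Toy example: two nodes, one edge, 1-dimensional stalks,
# identity restriction maps (i.e., ordinary graph diffusion shape).
x0 = np.array([[1.0], [0.0]])
edges = [(0, 1)]
F = {(0, 0): np.eye(1), (1, 0): np.eye(1)}
x1 = nonlinear_sheaf_diffusion_step(x0, edges, F, np.tanh, alpha=0.1)
```

With `phi = tanh`, large disagreements between neighbors produce a bounded response, which is the kind of bounded-confidence behavior a fixed linear Laplacian cannot express; NLSD's contribution, per the abstract, is learning `phi` from data rather than fixing it by hand.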
