Mixed-Curvature Transformers for Graph Representation Learning
Sungjun Cho · Seunghyuk Cho · Sungwoo Park · Hankook Lee · Honglak Lee · Moontae Lee
Event URL: https://openreview.net/forum?id=DFnk58DwTE

Real-world graphs naturally exhibit hierarchical or cyclical structures that are a poor fit for the typical Euclidean space. While there exist graph neural networks that leverage non-Euclidean spaces to embed such structures more accurately, these methods are confined to the message-passing paradigm, making them vulnerable to side effects such as oversmoothing. More recent work has proposed attention-based graph Transformers that can easily model long-range interactions, but their extension to non-Euclidean geometry remains unexplored. To bridge this gap, we propose the Fully Product-Stereographic Transformer, a generalization of the Transformer that operates entirely on a product of constant-curvature spaces. Our model learns the curvature appropriate for the input graph in an end-to-end fashion, without the need for additional tuning over different curvature initializations. We also provide a kernelized approach to non-Euclidean attention, which enables our model to run at a cost linear in the number of nodes and edges while respecting the underlying geometry. Experiments on graph reconstruction and node classification demonstrate the benefits of our approach.
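The linear-cost attention mentioned in the abstract builds on the general kernelization trick: replacing the softmax with a non-negative feature map lets the key-value summary be computed once, dropping the quadratic dependence on the number of nodes. A minimal Euclidean sketch is below (the `linear_attention` helper and its ReLU-based feature map are illustrative assumptions, not the paper's method, which additionally accounts for the product-stereographic geometry):

```python
import numpy as np

def linear_attention(Q, K, V, feature_map=lambda x: np.maximum(x, 0.0) + 1e-6):
    """Kernelized attention sketch: softmax(QK^T)V is approximated by
    phi(Q) (phi(K)^T V), reducing cost from O(n^2 d) to O(n d^2).

    NOTE: illustrative Euclidean version only; the paper's non-Euclidean
    attention operates on products of constant-curvature spaces.
    """
    Qp, Kp = feature_map(Q), feature_map(K)  # (n, d) non-negative features
    KV = Kp.T @ V                            # (d, d_v) key-value summary, built once
    Z = Qp @ Kp.sum(axis=0)                  # (n,) per-query normalizer
    return (Qp @ KV) / Z[:, None]            # rows are convex combinations of V
```

Because the feature map is non-negative, each output row is a convex combination of the value rows, mirroring the averaging behavior of softmax attention at linear cost.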

Author Information

Sungjun Cho (LG AI Research)
Seunghyuk Cho (LG AI Research)
Sungwoo Park (LG AI Research)
Hankook Lee (KAIST)
Honglak Lee (LG AI Research / U. Michigan)
Moontae Lee (University of Illinois at Chicago)