Poster
Transformers Meet Directed Graphs
Simon Markus Geisler · Yujia Li · Daniel Mankowitz · Taylan Cemgil · Stephan Günnemann · Cosmin Paduraru

Tue Jul 25 02:00 PM -- 04:30 PM (PDT) @ Exhibit Hall 1 #230

Transformers were originally proposed as a sequence-to-sequence model for text but have become vital for a wide range of modalities, including images, audio, video, and undirected graphs. However, transformers for directed graphs are a surprisingly underexplored topic, despite their applicability to ubiquitous domains, including source code and logic circuits. In this work, we propose two direction- and structure-aware positional encodings for directed graphs: (1) the eigenvectors of the Magnetic Laplacian, a direction-aware generalization of the combinatorial Laplacian; (2) directional random walk encodings. Empirically, we show that the extra directionality information is useful in various downstream tasks, including correctness testing of sorting networks and source code understanding. Together with a data-flow-centric graph construction, our model outperforms the prior state of the art on the Open Graph Benchmark Code2 by a relative 14.7%.
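For concreteness, the sketch below illustrates how the two positional encodings named in the abstract could be computed with NumPy. It is a minimal illustration, not the authors' implementation: it relies on the standard definition of the Magnetic Laplacian, L^(q) = D_s - A_s ⊙ exp(i Θ^(q)) with phase Θ^(q)_uv = 2πq(A_uv - A_vu), and on one simple variant of directional random walk features (k-step return probabilities of a walk that follows outgoing edges). All function names and parameter choices are illustrative.

```python
import numpy as np

def magnetic_laplacian_pe(A: np.ndarray, q: float = 0.25, k: int = 4) -> np.ndarray:
    """Eigenvector positional encodings from the Magnetic Laplacian (sketch).

    A: dense 0/1 adjacency matrix of a directed graph.
    q: potential; q = 0 recovers the combinatorial Laplacian of the
       symmetrized graph, q > 0 encodes edge direction in complex phases.
    """
    A_sym = np.minimum(A + A.T, 1.0)      # symmetrized adjacency
    theta = 2.0 * np.pi * q * (A - A.T)   # antisymmetric phase matrix
    H = A_sym * np.exp(1j * theta)        # Hermitian "magnetic" adjacency
    L = np.diag(A_sym.sum(axis=1)) - H    # Magnetic Laplacian (Hermitian)
    _, eigvecs = np.linalg.eigh(L)        # real eigenvalues, ascending order
    pe = eigvecs[:, :k]                   # eigenvectors of the k smallest eigenvalues
    # Real and imaginary parts become per-node features for the transformer.
    return np.concatenate([pe.real, pe.imag], axis=1)

def directional_rw_pe(A: np.ndarray, k: int = 4) -> np.ndarray:
    """Random-walk encodings that respect edge direction (sketch).

    Uses the probability of returning to each node after 1..k steps of a
    walk that only follows outgoing edges.
    """
    deg = A.sum(axis=1, keepdims=True)
    T = np.divide(A, deg, out=np.zeros_like(A, dtype=float), where=deg > 0)
    feats, Tk = [], np.eye(len(A))
    for _ in range(k):
        Tk = Tk @ T                       # k-step transition probabilities
        feats.append(np.diag(Tk))         # return probability per node
    return np.stack(feats, axis=1)

# Tiny usage example: a directed 3-cycle, which is distinguishable from its
# reverse only through the direction-aware phases / walk probabilities.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
print(magnetic_laplacian_pe(A, q=0.25, k=2).shape)  # (3, 4)
print(directional_rw_pe(A, k=3).shape)              # (3, 3)
```

Note that eigenvectors of a Hermitian matrix are only defined up to a complex phase per eigenvector, so any model consuming these encodings needs to handle that ambiguity.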

Author Information

Simon Markus Geisler (Technical University of Munich)
Yujia Li (DeepMind)
Daniel Mankowitz (Google)
Taylan Cemgil (DeepMind)
Stephan Günnemann (Technical University of Munich)
Cosmin Paduraru (DeepMind)
