

Poster

Simulation of Graph Algorithms with Looped Transformers

Artur Back de Luca · Kimon Fountoulakis

Hall C 4-9 #816
Thu 25 Jul 2:30 a.m. PDT — 4 a.m. PDT

Abstract:

The execution of graph algorithms using neural networks has recently attracted significant interest due to promising empirical progress. This motivates further understanding of how neural networks can replicate reasoning steps on relational data. In this work, we study the ability of transformer networks to simulate algorithms on graphs from a theoretical perspective. The architecture we use is a looped transformer with extra attention heads that interact with the graph. We prove by construction that this architecture can simulate individual algorithms such as Dijkstra's shortest path, Breadth- and Depth-First Search, and Kosaraju's strongly connected components, as well as multiple algorithms simultaneously. The number of parameters in the networks does not increase with the input graph size, which implies that the networks can simulate the above algorithms for any graph. Despite this property, we show that the simulation in our solution is nevertheless limited by finite precision. Finally, we show a Turing Completeness result with constant width when the extra attention heads are utilized.
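To make the architectural idea concrete, below is a minimal, hypothetical sketch of a looped transformer block with an extra attention head masked by the graph's adjacency matrix. It is not the authors' construction; the weight shapes, the ReLU MLP, and the masking scheme are illustrative assumptions only, and the weights are random rather than the ones the paper constructs. It only illustrates the key property stated in the abstract: the same fixed-size block is applied repeatedly, so the parameter count is independent of the number of nodes.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, Wq, Wk, Wv, mask=None):
    # Single-head attention; `mask` (if given) restricts attention to graph edges.
    scores = (X @ Wq) @ (X @ Wk).T / np.sqrt(Wq.shape[1])
    if mask is not None:
        scores = np.where(mask > 0, scores, -1e9)  # attend only along edges
    return softmax(scores, axis=-1) @ (X @ Wv)

def looped_transformer(X, A, params, num_loops):
    # Apply the same block `num_loops` times; weights are reused across loops,
    # so the parameter count does not grow with the input graph size.
    Wq1, Wk1, Wv1, Wq2, Wk2, Wv2, W1, W2 = params
    for _ in range(num_loops):
        X = X + attention(X, Wq1, Wk1, Wv1)            # standard attention head
        X = X + attention(X, Wq2, Wk2, Wv2, mask=A)    # extra graph-masked head
        X = X + np.maximum(X @ W1, 0.0) @ W2           # position-wise ReLU MLP
    return X

# Toy usage: 5-node path graph, random (untrained) weights, fixed state width d.
rng = np.random.default_rng(0)
n, d = 5, 8
A = np.zeros((n, n))
A[np.arange(n - 1), np.arange(1, n)] = 1
A += A.T
X = rng.normal(size=(n, d))
params = [rng.normal(scale=0.1, size=(d, d)) for _ in range(8)]
print(looped_transformer(X, A, params, num_loops=n).shape)  # (5, 8)
```

In the paper's setting, iterating such a fixed block is what allows multi-step algorithms like Dijkstra or BFS, whose step count depends on the graph, to be simulated by a network whose size does not.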
