

Poster

KnowFormer: Revisiting Transformers for Knowledge Graph Reasoning

Junnan Liu · Qianren Mao · Weifeng Jiang · Jianxin Li


Abstract:

Knowledge graph reasoning plays a vital role in various applications and has garnered considerable attention. Despite the impressive performance achieved by recent path-based methods, they may face limitations due to constraints in message-passing neural networks, including missing paths and information over-squashing. In this paper, we revisit the application of transformers to knowledge graph reasoning and propose a novel approach called KnowFormer, which utilizes a transformer architecture to conduct reasoning on knowledge graphs from a message-passing perspective, rather than by encoding textual information as previous transformer-based methods do. By leveraging the all-pair interaction of the attention mechanism, KnowFormer effectively addresses the constraints faced by path-based methods. To achieve this, we introduce a query-prototype-based attention definition, which facilitates convenient construction and efficient optimization. Furthermore, we introduce two sub-modules that enable the construction of structure-aware self-attention. Additionally, we present an efficient attention computation method to enhance scalability. Experimental results demonstrate the superior performance of KnowFormer compared to prominent baseline methods on both transductive and inductive benchmarks.
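For intuition only, below is a minimal, hypothetical PyTorch sketch of structure-aware all-pair attention over knowledge-graph entities, conditioned on a simple query prototype formed from the head entity and query relation. All class, parameter, and variable names (StructureAwareAttention, rel_bias, rel_adj, etc.) are illustrative assumptions and not the paper's actual implementation.

import torch
import torch.nn as nn


class StructureAwareAttention(nn.Module):
    """Attention over entity states where scores are biased by graph structure (illustrative sketch)."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # One learnable scalar bias per relation type, injecting structure
        # into otherwise all-pair attention (an assumption for illustration).
        self.rel_bias = nn.Embedding(num_relations + 1, 1)  # extra id = "no edge"

    def forward(self, h, rel_adj):
        # h:       (num_entities, dim) entity representations
        # rel_adj: (num_entities, num_entities) relation ids; id == num_relations means "no edge"
        q, k, v = self.q_proj(h), self.k_proj(h), self.v_proj(h)
        scores = q @ k.t() / h.size(-1) ** 0.5                 # all-pair interaction
        scores = scores + self.rel_bias(rel_adj).squeeze(-1)   # structural bias
        attn = scores.softmax(dim=-1)
        return attn @ v


# Toy usage: 5 entities, 3 relation types, query prototype = head entity + query relation.
dim, num_entities, num_relations = 16, 5, 3
entity_emb = nn.Embedding(num_entities, dim)
relation_emb = nn.Embedding(num_relations, dim)
layer = StructureAwareAttention(dim, num_relations)

# Dense relation map; unfilled pairs get the "no edge" id.
rel_adj = torch.full((num_entities, num_entities), num_relations)
rel_adj[0, 1], rel_adj[1, 2] = 0, 1

h = entity_emb.weight
# Inject the query (head entity 0, relation 0) as a prototype added to all entity states.
query_prototype = entity_emb(torch.tensor(0)) + relation_emb(torch.tensor(0))
out = layer(h + query_prototype, rel_adj)
print(out.shape)  # torch.Size([5, 16])

The sketch only shows the general idea of combining all-pair attention with relation-dependent biases and a query-conditioned input; the paper's query-prototype-based attention, structure-aware sub-modules, and efficient attention computation are defined differently and in full in the paper itself.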
