

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Rule-Enhanced Graph Learning

Ali · Abdolreza Mirzaei · Majjid Farhadi · Parmis Naddaf · Kiarash Zahirnia · Mohammad Salameh · Kevin Cannons · Richard Mar · Mingyi Wu · Oliver Schulte

Keywords: [ Neuro-symbolic AI ] [ Variational Graph Auto-encoder ] [ Statistical-Relational Learning ] [ Deep Graph Generative Model ]


Abstract:

Knowledge-enhanced graph learning is one of the current frontiers for neural models of graph data. In this paper, we propose a new approach to enhancing deep generative models with domain knowledge that is represented by first-order logic rules. First-order logic provides an expressive formalism for representing interpretable knowledge about relational structures. Our approach builds on ideas from statistical-relational learning (SRL), a field of machine learning that aims to combine first-order logic with statistical models. One of the fundamental concepts in SRL is rule moment matching: constrain model training such that the expected instance count of each rule matches its observed instance count. We adapt this idea for deep generative models by maximizing the (approximate) model likelihood subject to the rule moment matching constraint. Our main algorithmic contribution is a novel method for computing the expected rule instance count of a Variational Graph Autoencoder (VGAE), based on matrix multiplication. Empirical evaluation on four benchmark datasets shows that rule moment matching improves the quality of generated graphs substantially (by orders of magnitude on standard graph quality metrics).
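To make the matrix-multiplication idea concrete, the sketch below shows one way expected rule instance counts can be computed from a VGAE's edge-probability matrix. It is a minimal illustration, not the authors' method: it assumes an inner-product decoder with independent Bernoulli edges, and uses two illustrative rules (edge count and triangle count); all function names are hypothetical.

```python
# Hypothetical sketch: expected rule-instance counts from VGAE edge probabilities.
# Assumes an inner-product decoder P_ij = sigmoid(z_i . z_j) with independent
# Bernoulli edges; the edge and triangle "rules" are illustrative examples only.
import numpy as np

def edge_probabilities(Z: np.ndarray) -> np.ndarray:
    """Inner-product decoder: P_ij = sigmoid(z_i . z_j), with zero diagonal."""
    logits = Z @ Z.T
    P = 1.0 / (1.0 + np.exp(-logits))
    np.fill_diagonal(P, 0.0)
    return P

def expected_edge_count(P: np.ndarray) -> float:
    """Expected number of undirected edges under independent Bernoulli edges."""
    return P.sum() / 2.0

def expected_triangle_count(P: np.ndarray) -> float:
    """Expected number of triangles: trace(P^3) / 6, using matrix multiplication."""
    return np.trace(P @ P @ P) / 6.0

# A moment-matching penalty could then compare these expectations with the
# observed counts in the training graph and be added to the (approximate)
# likelihood objective, e.g. as a Lagrangian term.
Z = np.random.randn(50, 16)   # stand-in latent node embeddings from the encoder
P = edge_probabilities(Z)
print(expected_edge_count(P), expected_triangle_count(P))
```

Under the independent-edge assumption, both quantities reduce to polynomial functions of the probability matrix, which is what makes a matrix-multiplication formulation natural; the actual paper's computation of expected rule counts for first-order rules may differ.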
