

Poster

Convex and Bilevel Optimization for Neural-Symbolic Inference and Learning

Charles Dickens · Changyu Gao · Connor Pryor · Stephen Wright · Lise Getoor


Abstract: We leverage convex and bilevel optimization techniques to develop a general gradient-based parameter learning framework for neural-symbolic (NeSy) systems. We demonstrate our framework with NeuPSL, a state-of-the-art NeSy architecture. To achieve this, we propose a smooth primal and dual formulation of NeuPSL inference and show that learning gradients are functions of the optimal dual variables. Additionally, we develop a dual block coordinate descent algorithm for the new formulation that naturally exploits warm-starts. This leads to over $100\times$ learning runtime improvements over the current best NeuPSL inference method. Finally, we provide extensive empirical evaluations across $8$ datasets covering a range of tasks and demonstrate that our learning framework achieves up to a $16$-percentage-point improvement in prediction performance over alternative learning methods.
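The abstract combines three ingredients that can be illustrated together: a dual formulation of convex inference, learning gradients expressed through the optimal dual variables, and warm-started dual (block) coordinate updates inside the learning loop. The sketch below is a minimal Python illustration of this generic pattern on a toy equality-constrained quadratic program, not the paper's actual NeuPSL formulation; the names `A`, `b`, and `solve_inference_dual` are assumptions for illustration only. It uses the standard envelope-theorem fact that for $v(b) = \min\{\tfrac{1}{2}\|y\|^2 : Ay = b\}$, strong duality gives $\nabla_b v = -\nu^*$, so the learning gradient depends only on the optimal duals.

```python
import numpy as np

# Toy stand-in for convex NeSy inference (NOT the paper's formulation):
#   y*(w) = argmin_y 0.5 ||y||^2  s.t.  A y = b(w),
# where b(w) stands in for neural-model outputs entering the symbolic layer.
# All names here (A, b, solve_inference_dual) are illustrative assumptions.

def solve_inference_dual(A, b, nu0=None, epochs=200):
    """Coordinate ascent on the dual g(nu) = -0.5 nu^T (A A^T) nu - nu^T b,
    warm-started from nu0 (single coordinates stand in for dual blocks)."""
    M = A @ A.T
    nu = np.zeros(A.shape[0]) if nu0 is None else nu0.copy()
    for _ in range(epochs):
        for i in range(len(nu)):
            # Exact coordinate update: solve dg/dnu_i = -(M nu)_i - b_i = 0.
            nu[i] -= (M[i] @ nu + b[i]) / M[i, i]
    y = -A.T @ nu  # recover the primal solution from Lagrangian stationarity
    return y, nu

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))
w = rng.standard_normal(3)  # toy learnable parameters

nu = None  # dual variables carried across learning steps for warm-starting
for step in range(50):
    b = w  # toy parameterization: constraint right-hand side is w itself
    y, nu = solve_inference_dual(A, b, nu0=nu)  # warm start from last solve
    # Envelope theorem: with value-function loss L(w) = v(b(w)) and
    # db/dw = I here, the learning gradient is dL/dw = -nu*, a function
    # of the optimal duals alone -- no differentiation through the solver.
    w -= 0.1 * (-nu)
```

Warm-starting `nu0` with the previous step's duals is what makes repeated inference during learning cheap: consecutive parameter updates are small, so consecutive optimal duals stay close and coordinate ascent needs far fewer passes to reconverge.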
