
Invited Talk in Workshop: Sampling and Optimization in Discrete Space

Lianhui Qin: Differentiable and structured text reasoning


Abstract:

Text reasoning and generation in practice often need to meet complex objectives, integrate diverse contextual constraints, and stay grounded in logical structures for consistency. Current large LMs can produce fluent text and follow human instructions, but they still struggle to optimize effectively toward specific objectives. The discrete nature of text is one of the key obstacles to such optimization. In this talk, I will present our work on optimizing text reasoning and generation with both continuous and discrete methods. I will first introduce COLD, a unified energy-based framework that lets any off-the-shelf LM reason toward arbitrary objectives in a continuous space; this enables differentiable reasoning over discrete text and improves efficiency. I will then discuss Maieutic prompting, a method that improves the logical consistency of neural reasoning in a discrete space by integrating it with logical structures.
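
To make the continuous, energy-based idea concrete, below is a minimal illustrative sketch of gradient-based (Langevin-style) decoding over a "soft" token sequence, in the spirit of the energy-based framework described above. The toy LM, the specific energy terms, and all hyperparameters here are assumptions for illustration only, not the implementation presented in the talk.

```python
# Minimal sketch: energy-based decoding over a continuous relaxation of text.
# A sequence is represented as per-position logits; Langevin dynamics minimizes
# an energy combining an LM fluency term and a toy keyword constraint.
# ToyLM, the energy terms, and hyperparameters are illustrative assumptions.
import torch
import torch.nn.functional as F

VOCAB, SEQ_LEN, DIM = 100, 8, 32

class ToyLM(torch.nn.Module):
    """Stand-in left-to-right LM: embeds soft tokens, predicts next-token logits."""
    def __init__(self):
        super().__init__()
        self.embed = torch.nn.Linear(VOCAB, DIM, bias=False)  # soft-token embedding
        self.rnn = torch.nn.GRU(DIM, DIM, batch_first=True)
        self.out = torch.nn.Linear(DIM, VOCAB)

    def forward(self, soft_tokens):            # (B, T, VOCAB) rows on the simplex
        h, _ = self.rnn(self.embed(soft_tokens))
        return self.out(h)                     # next-token logits, (B, T, VOCAB)

def energy(logits_y, lm, keyword_id):
    """Energy = LM fluency term + a toy 'mention this keyword' constraint."""
    soft = F.softmax(logits_y, dim=-1)         # relax discrete tokens to the simplex
    pred = lm(soft)                            # LM's predictions given soft prefixes
    # fluency: each position should agree with the LM's next-token distribution
    fluency = -(F.log_softmax(pred[:, :-1], dim=-1) * soft[:, 1:]).sum(-1).mean()
    # constraint: encourage the keyword to appear somewhere in the sequence
    constraint = -soft[..., keyword_id].max(dim=1).values.log().mean()
    return fluency + constraint

torch.manual_seed(0)
lm = ToyLM()
y = torch.randn(1, SEQ_LEN, VOCAB, requires_grad=True)  # continuous "soft" sequence

step, noise_scale = 0.1, 0.01
for n in range(200):                           # Langevin dynamics on y
    e = energy(y, lm, keyword_id=7)
    grad, = torch.autograd.grad(e, y)
    with torch.no_grad():
        y -= step * grad                       # gradient step on the energy
        y += noise_scale * torch.randn_like(y) # injected Gaussian noise

tokens = y.argmax(dim=-1)                      # discretize back to hard tokens
print(tokens)
```

The key point the sketch conveys is that, once text is relaxed into a continuous representation, arbitrary differentiable objectives (fluency, constraints) can be folded into a single energy and optimized with gradients, and the result is discretized back to tokens at the end.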
