DTKG: Dual-Track Knowledge Graph-Verified Reasoning Framework for Multi-Hop QA
Abstract
Multi-hop reasoning for question answering (QA) plays a critical role in retrieval-augmented generation (RAG) for large language models (LLMs). Based on their inherent relation dependencies and reasoning patterns, multi-hop questions fall into two categories: parallel fact verification (simultaneously verifying independent sub-questions) and chained reasoning (sequential multi-step inference). Existing approaches adopt either LLM-based fact verification or KG path-based chain construction and fail to handle both categories well: the former underperforms on chained reasoning, while the latter suffers from redundant paths on parallel tasks. Inspired by Dual Process Theory in cognitive science and Stanovich's Cognitive Misers Theory, we propose DTKG (Dual-Track Knowledge Graph), an effective multi-hop QA framework built as a two-stage pipeline: i) a Classification Stage (dynamic question categorization via few-shot prompting, emulating "unconscious processing"); and ii) a Branch Processing Stage (reasoning paths tailored to each category, emulating "conscious processing"). Experiments on six datasets show that DTKG achieves 5.0\%-29.5\% performance improvements. The code is available at https://anonymous.4open.science/r/DTKG-621F
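To make the two-stage pipeline concrete, the sketch below shows the classify-then-dispatch control flow in Python. All function names (`classify_question`, `parallel_verify`, `chained_reason`, `dtkg_answer`) are hypothetical stand-ins: the real classifier is a few-shot LLM prompt and the branch functions query a knowledge graph, whereas here both are replaced by trivial placeholders so the routing logic is runnable.

```python
from typing import List, Tuple

def classify_question(question: str) -> str:
    # Stand-in for the few-shot LLM classifier ("unconscious processing"):
    # a toy heuristic labels conjunctive questions "parallel", else "chained".
    return "parallel" if " and " in question else "chained"

def parallel_verify(sub_questions: List[str]) -> List[str]:
    # Parallel track: each independent sub-question would be fact-verified
    # against the KG; here we only tag it to show the branch was taken.
    return [f"verified({q})" for q in sub_questions]

def chained_reason(question: str, hops: int = 2) -> List[str]:
    # Chained track: sequential inference where each hop would extend a KG
    # path from the previous intermediate entity.
    path, node = [], question
    for i in range(hops):
        node = f"hop{i + 1}({node})"
        path.append(node)
    return path

def dtkg_answer(question: str) -> Tuple[str, List[str]]:
    # Branch Processing Stage ("conscious processing"): route the question
    # to the track chosen by the classifier.
    track = classify_question(question)
    if track == "parallel":
        subs = [s.strip() for s in question.split(" and ")]
        return track, parallel_verify(subs)
    return track, chained_reason(question)
```

The key design point is that the classifier's output alone determines which reasoning machinery runs, so the expensive track is invoked only when the question's structure requires it.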