

Poster

Learning Solution-Aware Transformers for Efficiently Solving Quadratic Assignment Problem

Zhentao Tan · Yadong Mu


Abstract: Recently, various optimization problems, such as Mixed Integer Linear Programs (MILPs), have been comprehensively investigated using machine learning. This work focuses on learning-based solutions for efficiently solving the Quadratic Assignment Problem (QAP), which stands as a formidable challenge in combinatorial optimization. While many simpler problems admit $\epsilon$-approximation algorithms, the QAP is strongly NP-hard; even finding an $\epsilon$-approximate solution is difficult, in the sense that the existence of a polynomial-time $\epsilon$-approximation algorithm would imply $P = NP$. Current research on QAPs suffers from limited scale and computational inefficiency. To address these issues, we propose the first solution of its kind for the QAP in the learn-to-improve category. Our method encodes facility and location nodes separately, instead of forming the computationally intensive association graphs prevalent in current approaches; this design choice enables scalability to larger problem sizes. Furthermore, a Solution-Aware Transformer (SAT) architecture integrates the incumbent solution matrix with the attention scores to effectively capture higher-order information about QAP instances. Our model's effectiveness is validated through extensive experiments on self-generated QAP instances of varying sizes and on the QAPLIB benchmark.
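For readers unfamiliar with the problem the abstract targets, the QAP assigns $n$ facilities to $n$ locations so as to minimize the total flow-weighted distance. A minimal sketch of the standard objective (this is the textbook formulation, not code from the paper; the function name and matrix conventions are illustrative):

```python
import numpy as np

def qap_cost(F, D, perm):
    """Standard QAP objective: sum_{i,j} F[i, j] * D[perm[i], perm[j]],
    where F is the facility flow matrix, D the location distance matrix,
    and perm[i] is the location assigned to facility i."""
    # D[np.ix_(perm, perm)][i, j] == D[perm[i], perm[j]]
    return float((F * D[np.ix_(perm, perm)]).sum())

# Tiny 3-facility instance with the identity assignment
F = np.array([[0, 3, 1], [3, 0, 2], [1, 2, 0]])
D = np.array([[0, 1, 4], [1, 0, 2], [4, 2, 0]])
print(qap_cost(F, D, np.array([0, 1, 2])))  # -> 22.0
```

A learn-to-improve solver, as described above, would repeatedly perturb `perm` (e.g., via model-guided swaps) and keep changes that lower this cost.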
