

Poster

Learning Solution-Aware Transformers for Efficiently Solving Quadratic Assignment Problem

Zhentao Tan · Yadong Mu

Hall C 4-9
Wed 24 Jul 2:30 a.m. PDT — 4 a.m. PDT

Abstract:

Recently, various optimization problems, such as Mixed Integer Linear Programming (MILP) problems, have been comprehensively investigated with the help of machine learning. This work focuses on learning-based solutions for efficiently solving Quadratic Assignment Problems (QAPs), which stand as a formidable challenge in combinatorial optimization. While many instances of simpler problems admit a fully polynomial-time approximation scheme (FPTAS), QAP is known to be strongly NP-hard. Even finding an FPTAS for QAP is difficult, in the sense that the existence of an FPTAS would imply P = NP. Current research on QAPs suffers from limited scale and computational inefficiency. To address these issues, we propose the first solution of its kind for QAP in the learn-to-improve category. This work encodes facility and location nodes separately, instead of forming the computationally intensive association graphs prevalent in current approaches. This design choice enables scalability to larger problem sizes. Furthermore, a Solution AWare Transformer (SAWT) architecture integrates the incumbent solution matrix with the attention score to effectively capture higher-order information about the QAPs. Our model's effectiveness is validated through extensive experiments on self-generated QAP instances of varying sizes and the QAPLIB benchmark.
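The abstract states that SAWT "integrates the incumbent solution matrix with the attention score" but does not give the exact formulation. The sketch below is a hypothetical, minimal interpretation of that idea: the incumbent solution (a facility-to-location permutation matrix) is added as a bias to the scaled dot-product attention logits. The function name, the additive-bias mechanism, and the `alpha` weight are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def solution_aware_attention(Q, K, V, sol, alpha=1.0):
    """Scaled dot-product attention biased by an incumbent solution matrix.

    Hypothetical sketch: we model "integrating the solution matrix with the
    attention score" as an additive bias `alpha * sol` on the attention
    logits; the paper's exact mechanism may differ.
    """
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d) + alpha * sol  # bias logits with the solution
    logits -= logits.max(axis=-1, keepdims=True)  # softmax, numerically stable
    weights = np.exp(logits)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy instance: 4 facilities/locations, 8-dim embeddings, and an incumbent
# assignment represented as a random permutation matrix.
rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = rng.normal(size=(3, n, d))
perm = np.eye(n)[rng.permutation(n)]  # incumbent facility-to-location solution
out = solution_aware_attention(Q, K, V, perm)
print(out.shape)  # (4, 8)
```

Under this reading, entries of the solution matrix raise the attention logits between a facility and its currently assigned location, letting the attention weights condition on the incumbent solution that a learn-to-improve method iteratively refines.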
