Oral in Workshop: Sampling and Optimization in Discrete Space

Accelerating Diffusion-based Combinatorial Optimization Solvers by Progressive Distillation

Junwei Huang · Zhiqing Sun · Yiming Yang


Abstract: Graph-based diffusion models have shown promising results in generating high-quality solutions to NP-complete (NPC) combinatorial optimization (CO) problems. However, these models are often slow at inference time due to the iterative nature of the denoising diffusion process. This paper proposes to use $\textit{progressive}$ distillation to speed up inference by taking fewer denoising steps (e.g., forecasting two steps ahead within a single step). Our experimental results show that the progressively distilled model can perform inference $\textbf{16}$ times faster with only $\textbf{0.019}$% degradation in performance on the TSP-50 dataset.
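To make the idea concrete, here is a minimal sketch of one progressive-distillation round in PyTorch: a student network is trained to match the result of two consecutive teacher denoising steps in a single step of twice the size. This is an illustrative toy, not the authors' implementation; `TinyDenoiser`, the residual update rule, and the random data are all hypothetical stand-ins for the paper's graph-based diffusion solver.

```python
# Toy sketch of progressive distillation (assumed setup, not the paper's code).
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Hypothetical stand-in for a graph-based diffusion denoiser."""
    def __init__(self, dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, 64), nn.ReLU(), nn.Linear(64, dim)
        )

    def forward(self, x, t):
        # Condition on the scalar timestep by concatenation.
        t_feat = t.expand(x.shape[0], 1)
        return self.net(torch.cat([x, t_feat], dim=-1))

def teacher_two_steps(teacher, x, t, dt):
    """Run the teacher for two consecutive denoising steps of size dt."""
    with torch.no_grad():
        x = x - dt * teacher(x, t)        # step t -> t - dt
        x = x - dt * teacher(x, t - dt)   # step t - dt -> t - 2*dt
    return x

teacher = TinyDenoiser()
student = TinyDenoiser()
student.load_state_dict(teacher.state_dict())  # init student from teacher
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for _ in range(100):
    x = torch.randn(32, 16)   # toy noisy samples
    t = torch.rand(1)
    dt = torch.tensor(0.05)
    target = teacher_two_steps(teacher, x, t, dt)
    # Student must match two teacher steps with one step of size 2*dt.
    pred = x - 2 * dt * student(x, t)
    loss = ((pred - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Repeating this round (the distilled student becomes the next teacher) halves the number of denoising steps each time, which is how a 16x inference speedup can be reached after four rounds.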