Poster

MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts

Jianan Zhou · Zhiguang Cao · Yaoxin Wu · Wen Song · Yining Ma · Jie Zhang · Xu Chi

Hall C 4-9 #1003
Thu 25 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are structured and trained independently for a specific problem, making them less generic and practical. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly improves zero-shot generalization on 10 unseen VRP variants, and shows decent results in the few-shot setting and on real-world benchmark instances. We further conduct extensive studies on the effect of MoE configurations in solving VRPs, and observe the superiority of hierarchical gating when facing out-of-distribution data. The source code is available at: https://github.com/RoyalSkye/Routing-MVMoE.
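
To make the mixture-of-experts idea concrete, below is a minimal PyTorch sketch of an MoE feed-forward layer with a two-level gate. Everything here is an illustrative assumption rather than the authors' implementation: the class name HierarchicalMoE, the parameters num_experts and top_k, and the particular node-level/expert-level split are hypothetical, and the paper's actual mechanism should be taken from the linked repository.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalMoE(nn.Module):
    """Illustrative MoE layer with a two-level gate (not the paper's exact design)."""
    def __init__(self, d_model: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network; adding experts grows
        # capacity, while the gate selects only top_k of them per token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.ReLU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(num_experts)
        )
        # Level 1: a coarse per-token gate that scales the MoE contribution.
        # Level 2: the standard per-token expert-selection gate.
        self.node_gate = nn.Linear(d_model, 1)
        self.expert_gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, d_model) node embeddings of a VRP instance.
        mix = torch.sigmoid(self.node_gate(x))            # (B, N, 1)
        logits = self.expert_gate(x)                      # (B, N, E)
        weights, idx = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize over top_k
        out = torch.zeros_like(x)
        # For clarity, every expert is evaluated on all tokens and masked;
        # a real implementation dispatches tokens to the chosen experts only.
        for e, expert in enumerate(self.experts):
            expert_out = expert(x)                        # (B, N, d_model)
            for slot in range(self.top_k):
                mask = (idx[..., slot] == e).unsqueeze(-1).float()
                out = out + mask * weights[..., slot:slot + 1] * expert_out
        return x + mix * out                              # residual connection

layer = HierarchicalMoE(d_model=128)
y = layer(torch.randn(2, 50, 128))  # e.g. a batch of 2 instances, 50 nodes each

The design choice this sketch illustrates is the one the abstract names: since only the top_k expert outputs are mixed per token, the parameter count scales with num_experts while the effective per-token compute stays roughly constant, and a cheaper first-level gate can modulate (or skip) the sparse second-level routing to trade performance against complexity.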
