

Poster

MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts

Jianan Zhou · Zhiguang Cao · Yaoxin Wu · Wen Song · Yining Ma · Jie Zhang · Xu Chi


Abstract:

Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are structured and trained independently for a specific problem, which limits their generality and practicality. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for the MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly improves zero-shot generalization performance on 10 unseen VRP variants, and achieves decent results in the few-shot setting and on a real-world benchmark. We further provide extensive studies on the effect of MoE configurations in solving VRPs.
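For readers unfamiliar with sparse mixture-of-experts layers, the following minimal PyTorch sketch illustrates the general mechanism the abstract refers to: several expert feed-forward networks plus a learned gate that routes each input to only its top-k experts, so capacity grows with the number of experts while per-input compute scales with k. The layer sizes, expert count, and plain softmax top-k gate here are illustrative assumptions, not the paper's specific MVMoE architecture or its hierarchical gating design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    """Minimal sparse MoE feed-forward layer with top-k gating.

    Illustrative sketch only: hyperparameters and the gating scheme are
    assumptions, not the MVMoE paper's exact design.
    """

    def __init__(self, d_model: int = 128, d_ff: int = 512,
                 num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        # The gate (router) scores every expert for each input.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, d_model). Keep only the top-k gate scores per input,
        # so each input activates k experts instead of all of them.
        logits = self.gate(x)                           # (batch, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # (batch, top_k)
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    # Weighted contribution of expert e for inputs routed to it.
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out


# Usage: per-input compute involves only top_k experts, so adding experts
# raises capacity without a proportional increase in FLOPs per input.
layer = SparseMoELayer()
y = layer(torch.randn(32, 128))  # (32, 128)
```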
