Provably Data-driven Lagrangian Relaxation for Mixed Integer Linear Programming
Tung Le ⋅ Anh Nguyen ⋅ Viet Anh Nguyen
Abstract
Lagrangian Relaxation (LR) is a powerful technique for solving large-scale Mixed Integer Linear Programs (MILPs), particularly those with decomposable structures such as Vehicle Routing or Unit Commitment. By relaxing coupling constraints and penalizing their violation through Lagrangian multipliers, LR enables parallel solving of subproblems and frequently yields tighter dual bounds than standard linear programming relaxations, which is crucial for efficient branch-and-bound pruning. While recent empirical works have shown promising results using machine learning to predict these multipliers, a theoretical understanding of such methods remains an open question. In this work, we bridge this gap by analyzing the problem of learning LR through the lens of Data-driven Algorithm Design, i.e., as a statistical learning problem over a distribution of problem instances. Our contributions are as follows. First, we derive a generalization bound of $\mathcal{O}(s^{1.5}/\sqrt{N})$ for the learned multipliers, where $s$ is the number of coupling constraints and $N$ is the sample size. Second, we provide a minimax lower bound of $\Omega(s/\sqrt{N})$, proving that a linear dependence on $s$ is unavoidable. Finally, we extend our framework to the problem of learning to warm-start subgradient ascent.
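For concreteness, the following is a minimal sketch of the Lagrangian relaxation setup the abstract refers to; the notation ($c$, $A$, $b$, $\mathcal{X}$, $\lambda$, $\alpha_k$) is illustrative and not taken from the paper. Given a MILP $\min_{x \in \mathcal{X}} c^\top x$ subject to $s$ coupling constraints $Ax \le b$, where $\mathcal{X}$ is a decomposable mixed-integer feasible set, relaxing the coupling rows with multipliers $\lambda \ge 0$ gives the dual function and dual problem
$$ g(\lambda) = \min_{x \in \mathcal{X}} \; c^\top x + \lambda^\top (Ax - b), \qquad \max_{\lambda \ge 0} \; g(\lambda), $$
where $g(\lambda)$ is a valid lower bound on the MILP optimum for any $\lambda \ge 0$ (weak duality), and the inner minimization splits across the decomposable blocks of $\mathcal{X}$. Projected subgradient ascent on the dual takes the form
$$ \lambda^{k+1} = \bigl[\lambda^k + \alpha_k \,(A x^k - b)\bigr]_+, $$
where $x^k$ attains $g(\lambda^k)$, $Ax^k - b$ is a subgradient of the concave function $g$ at $\lambda^k$, and $[\cdot]_+$ projects onto the nonnegative orthant. In this reading, "learning LR" predicts $\lambda$ directly from the instance, and "learning to warm-start subgradient ascent" predicts the initial point $\lambda^0$ of this iteration.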