Accelerated Multiple Wasserstein Gradient Flows for Multi-objective Distributional Optimization
DaiHai Nguyen ⋅ Duc-Dung Nguyen ⋅ Atsuyoshi Nakamura ⋅ Hiroshi Mamitsuka
Abstract
We study multi-objective optimization over probability distributions in Wasserstein space. Recently, \citet{nguyen2025multiple} introduced the Multiple Wasserstein Gradient Descent (MWGraD) algorithm, which exploits the geometric structure of Wasserstein space to jointly optimize multiple objectives. Building on this approach, we propose an accelerated variant, A-MWGraD, inspired by Nesterov's acceleration. We analyze the continuous-time dynamics and establish convergence to weakly Pareto optimal points in probability space. Our theoretical results show that A-MWGraD achieves a convergence rate of $\mathcal{O}(1/t^2)$ for geodesically convex objectives and $\mathcal{O}(e^{-\sqrt{\beta}t})$ for $\beta$-strongly geodesically convex objectives, improving upon the $\mathcal{O}(1/t)$ rate of MWGraD in the geodesically convex setting. We further introduce a practical kernel-based discretization of A-MWGraD and demonstrate through numerical experiments that it consistently outperforms MWGraD in convergence speed and sampling efficiency on multi-target sampling tasks.
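To make the kernel-based accelerated scheme concrete, the following is a minimal sketch, not the paper's exact algorithm. It assumes a Gaussian (RBF) kernel, an SVGD-style estimate of each objective's Wasserstein gradient, uniform weights across objectives (MWGraD/A-MWGraD would compute adaptive weights to reach Pareto-stationary points), and a standard Nesterov look-ahead schedule applied to particle positions; the names `svgd_direction` and `a_mwgrad_sketch` are hypothetical.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise Gaussian kernel K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2))
    # and its gradient gradK[i, j] = d/dx_i k(x_i, x_j).
    diff = X[:, None, :] - X[None, :, :]
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))
    gradK = -(diff / h ** 2) * K[:, :, None]
    return K, gradK

def svgd_direction(X, score, h=1.0):
    # Kernelized estimate of one objective's Wasserstein gradient flow
    # direction (SVGD-style): phi(x_i) = (1/n) sum_j [k(x_j, x_i) s(x_j)
    # + grad_{x_j} k(x_j, x_i)], where s = grad log p.
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    return (K @ score(X) + gradK.sum(axis=0)) / n

def a_mwgrad_sketch(X0, scores, steps=500, lr=0.05, h=1.0):
    # Hypothetical accelerated loop: Nesterov-style momentum on particles.
    X, V = X0.copy(), np.zeros_like(X0)
    for t in range(1, steps + 1):
        gamma = (t - 1) / (t + 2)      # standard Nesterov schedule
        Y = X + gamma * V              # look-ahead positions
        # Uniform averaging over objectives is a simplification; the paper's
        # method selects weights dynamically over the probability simplex.
        phi = np.mean([svgd_direction(Y, s, h) for s in scores], axis=0)
        X_new = Y + lr * phi
        V = X_new - X
        X = X_new
    return X

# Demo: drive particles toward a compromise between two unit-variance
# Gaussian targets, whose scores are grad log p(x) = -(x - mu).
rng = np.random.default_rng(0)
X0 = rng.normal(size=(100, 2))
scores = [lambda X: -(X - np.array([3.0, 0.0])),
          lambda X: -(X + np.array([3.0, 0.0]))]
X_final = a_mwgrad_sketch(X0, scores)
```

With uniform weights the fixed points balance the two score fields, so the particles settle between the targets; swapping in adaptively chosen weights is what distinguishes the multi-objective method from plain averaged SVGD.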