LoBCD-GW: A Fast and Data-Dependent Algorithm for Computing Gromov-Wasserstein Distance via Localized Block Coordinate Descent
Jingni Song ⋅ Jiawei Huang ⋅ Kangke Cheng ⋅ Bangxian Han ⋅ Hu Ding
Abstract
The Gromov-Wasserstein (GW) distance provides a powerful framework for aligning structured data by comparing the intrinsic geometries of metric measure spaces, and has become a fundamental tool in machine learning. Most existing methods rely on entropy regularization to reduce the computational complexity to $O(n^3)$, where $n$ is the number of samples. However, this cubic time complexity remains a major bottleneck in large-scale applications and severely limits scalability. To address this challenge, we propose LoBCD-GW, an efficient GW optimization algorithm. Specifically, we reveal the data-dependent sparsity of large-magnitude updates to the coupling matrix and introduce a localized block coordinate selection strategy. This confines the optimization to a "selected set" of size $r$, a data-dependent parameter that is usually much smaller than $n$, thereby reducing the complexity to $O(r^3)$. In addition, unlike prior acceleration methods that rely on constraint relaxation, our method guarantees strict feasibility through a novel "marginal compensation mechanism" that synchronizes local mass redistribution with the global marginal constraints. Finally, we conduct a set of experiments on various datasets, and the results demonstrate that our method achieves a $160\times$ speedup on large-scale graph alignment benchmarks while maintaining state-of-the-art accuracy.
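To make the high-level idea concrete, the following is a minimal, illustrative sketch of a localized block-coordinate update with marginal restoration. All function names, the selection rule (largest gradient magnitude), and the Sinkhorn-style rescaling used to re-impose the marginals are assumptions for illustration only; they are a toy stand-in for, not the paper's actual, selection strategy and compensation mechanism.

```python
import numpy as np

def select_block(grad, r):
    # Hypothetical selection rule: keep the r rows and r columns of the
    # coupling whose gradient entries have the largest total magnitude
    # (an illustrative analogue of the paper's "selected set" of size r).
    row_idx = np.argsort(-np.abs(grad).sum(axis=1))[:r]
    col_idx = np.argsort(-np.abs(grad).sum(axis=0))[:r]
    return row_idx, col_idx

def local_update_with_compensation(T, grad, mu, nu, r, step=0.01):
    # Gradient step restricted to an r x r block of the coupling T, followed
    # by a Sinkhorn-style rescaling that restores the row/column marginals
    # (mu, nu) -- a toy stand-in for the marginal compensation mechanism.
    rows, cols = select_block(grad, r)
    T = T.copy()
    block = T[np.ix_(rows, cols)] - step * grad[np.ix_(rows, cols)]
    T[np.ix_(rows, cols)] = np.clip(block, 0.0, None)
    for _ in range(50):
        T *= (mu / np.maximum(T.sum(axis=1), 1e-12))[:, None]
        T *= (nu / np.maximum(T.sum(axis=0), 1e-12))[None, :]
    return T
```

Only an $r \times r$ block is touched by the gradient step, which is where the $O(r^3)$-style savings come from; the rescaling at the end keeps the iterate inside the transport polytope rather than relaxing the constraints.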