Poster
Statistical and Computational Guarantees of Kernel Max-Sliced Wasserstein Distances
Jie Wang · March Boedihardjo · Yao Xie
East Exhibition Hall A-B #E-1806
In "Statistical and Computational Guarantees of Kernel Max-Sliced Wasserstein Distances", Wang, Boedihardjo, and Xie introduce a rigorous theoretical and algorithmic framework for the Kernel Max-Sliced (KMS) Wasserstein distance, a flexible and powerful tool for comparing probability distributions in high-dimensional spaces. KMS generalizes the max-sliced Wasserstein distance by replacing linear projections with nonlinear projections in a Reproducing Kernel Hilbert Space (RKHS), capturing nonlinear differences between distributions more effectively.The authors provide dimension-free, finite-sample guarantees for the KMS p-Wasserstein distance, showing it converges at the optimal rate under mild assumptions. On the computational side, they prove that computing the KMS 2-Wasserstein distance is NP-hard and thus develop a tractable semidefinite relaxation (SDR) formulation with provable approximation bounds and efficient first-order optimization algorithms. Notably, they also establish a novel rank bound on the SDR solutions and propose a rank-reduction procedure for improved interpretability and performance.Extensive experiments demonstrate the superior performance of this framework in high-dimensional two-sample testing, human activity change detection, and generative modeling. The KMS Wasserstein distance outperforms various baselines, including MMD, Sinkhorn divergence, and sliced Wasserstein distances, especially when the data exhibits nonlinear structures.