Diffusion Flow Matching: Dimension-Improved KL Bounds and Wasserstein Guarantees
Abstract
Diffusion Flow Matching (DFM) has recently emerged as a versatile framework for generative modeling, yet its theoretical convergence properties remain only partially understood. In this work, we provide refined and novel convergence guarantees for Brownian-motion-based DFMs, focusing on the discretization error. Our analysis is conducted under the Kullback–Leibler (KL) divergence and the 2-Wasserstein distance. Under finite-moment and mild integrability assumptions, we derive KL convergence bounds with improved dimensional dependence compared to prior work, achieving, to the best of our knowledge, state-of-the-art scaling under minimal conditions. We further extend the analysis to the 2-Wasserstein distance: assuming weak log-concavity and one-sided Lipschitz continuity, we obtain convergence guarantees whose dimensional dependence is consistent with the KL case.