The Accumulation of Score Estimation Error in Diffusion Models
Abstract
Diffusion models are widely used for high-quality generation, but their performance is sensitive to the accuracy of the estimated score. We first develop our main bounds in a Gaussian-mixture setting, where the score admits a closed-form structure and the score Hessian can be controlled explicitly, leading to sharp Wasserstein estimates. We then extend the analysis to general data distributions, which yields a more general but typically looser upper bound. This general bound can be sharpened under mild regularity: when the initial distribution has a globally Lipschitz score, the curvature contribution at small times is uniformly bounded, avoiding the worst-case blow-up. Our results make precise how discretization choices govern the accumulation of score error, aligning with empirical observations on the benefits of certain step-size schedules. The results hold for both variance-preserving (VP) and variance-exploding (VE) diffusions, and apply to both the reverse-time SDE and the associated probability-flow ODE.
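For concreteness, the "closed-form structure" of the Gaussian-mixture score invoked above is the standard mixture-posterior identity (a well-known fact, not stated explicitly in the abstract; the weights $w_k$, means $\mu_k$, and covariances $\Sigma_k$ below are generic notation, not symbols fixed by the paper):

```latex
% Score of a Gaussian mixture p_0(x) = \sum_k w_k \mathcal{N}(x; \mu_k, \Sigma_k):
\nabla_x \log p_0(x)
  = \sum_k \pi_k(x)\, \Sigma_k^{-1} (\mu_k - x),
\qquad
\pi_k(x)
  = \frac{w_k\, \mathcal{N}(x; \mu_k, \Sigma_k)}
         {\sum_j w_j\, \mathcal{N}(x; \mu_j, \Sigma_j)},
```

i.e., a responsibility-weighted average of the component scores $-\Sigma_k^{-1}(x - \mu_k)$. Differentiating this expression once more gives the score Hessian, which is presumably the quantity the abstract says can be "controlled explicitly" in the mixture setting.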