Align Your Trajectory Tangent: Training Better Consistency Models via Manifold-Aligned Tangents
Abstract
With diffusion and flow matching models achieving state-of-the-art generative performance, the community's interest has now turned to reducing inference time without sacrificing sample quality. Consistency Models (CMs), which are trained to be consistent along diffusion or probability flow ordinary differential equation (PF-ODE) trajectories, enable one- or two-step flow or diffusion sampling. However, CMs typically require prolonged training with large batch sizes to obtain competitive sample quality. In this paper, we examine the training dynamics of CMs near convergence and discover that CM trajectory tangents -- CM output update directions -- are oscillatory, in the sense that they move parallel to the data manifold rather than toward it. To mitigate oscillatory trajectory tangents, we propose a new loss function, called the {\em manifold feature distance (MFD)}, which provides manifold-aligned trajectory tangents that point toward the data manifold. Consequently, our method -- dubbed {\em Align Your Trajectory Tangent (AYT)} -- can accelerate CM training by orders of magnitude and even outperform the learned perceptual image patch similarity (LPIPS) metric. Furthermore, we find that our loss enables training with extremely small batch sizes without compromising sample quality.