Just Y-Prediction: Enabling Historical Cumulative Inconsistency in Label Diffusion for Learning with Noisy Labels
Senyu Hou ⋅ Gaoxia Jiang ⋅ Xinyi Zheng ⋅ Yaqing Guo ⋅ Shuna Liang ⋅ Wenjian Wang
Abstract
Label noise is pervasive in real-world datasets and significantly compromises model generalization, fueling extensive research into Learning with Noisy Labels (LNL). Most LNL methods focus on robust discriminative learning, while recent generative classifiers such as label diffusion models (LDMs) show superior robustness by modeling class posteriors. However, current LDMs predominantly rely on standard $\epsilon$-prediction, where the Gaussian noise target lacks explicit class semantics, limiting both optimization and inference in label-noise environments. To address this issue, we propose just y-prediction (JYP), a novel training paradigm that enables LDMs to directly characterize the label manifold and leverage explicit class-semantic guidance. Theoretically, we prove that JYP converges to an optimal solution equivalent to that of $\epsilon$-prediction within the label diffusion framework, while facilitating accelerated convergence and enabling one-step inference. Building on JYP, we further incorporate historical cumulative inconsistency to adaptively tailor optimization strategies for clean, noisy, and hard samples. Extensive experiments demonstrate that our method consistently outperforms competitors across diverse synthetic noisy datasets and achieves state-of-the-art performance on multiple real-world benchmarks.
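To make the distinction between the two objectives concrete, the following is a minimal PyTorch sketch contrasting standard $\epsilon$-prediction with a y-prediction objective for a label diffusion model. The denoiser interface, the `alpha_bar` noise schedule, and all tensor shapes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

# Illustrative sketch only: `denoiser(x, y_t, t)` is an assumed interface
# mapping an input, a noised label vector, and a timestep to a prediction;
# `alpha_bar` is an assumed 1-D tensor of cumulative noise-schedule products.

def diffuse_labels(y_onehot, t, alpha_bar):
    """Forward-diffuse one-hot labels: y_t = sqrt(a_t)*y + sqrt(1-a_t)*eps."""
    eps = torch.randn_like(y_onehot)
    a_t = alpha_bar[t].view(-1, 1)                 # (B, 1), broadcast over classes
    y_t = a_t.sqrt() * y_onehot + (1.0 - a_t).sqrt() * eps
    return y_t, eps

def eps_prediction_loss(denoiser, x, y_onehot, t, alpha_bar):
    # Standard objective: regress the Gaussian noise, a target that
    # carries no explicit class semantics.
    y_t, eps = diffuse_labels(y_onehot, t, alpha_bar)
    eps_hat = denoiser(x, y_t, t)
    return F.mse_loss(eps_hat, eps)

def y_prediction_loss(denoiser, x, y_onehot, t, alpha_bar):
    # JYP-style objective (as sketched here): regress the clean label
    # itself, so the target lives on the label manifold and supplies
    # class-semantic guidance. It also admits one-step inference,
    # y_hat = denoiser(x, y_t, t), without iterating over timesteps.
    y_t, _ = diffuse_labels(y_onehot, t, alpha_bar)
    y_hat = denoiser(x, y_t, t)
    return F.mse_loss(y_hat, y_onehot)
```

Because the regression target in `y_prediction_loss` is the label itself, a single denoiser call already yields a class estimate, which is the property the abstract refers to as one-step inference.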