Rank-guided Diffusion for Noisy Few-Shot Learning
Abstract
In real-world Few-Shot Learning (FSL), support sets are constructed quickly and inevitably contain noisy samples. With only a few examples per class, even a single noisy instance can distort class distributions, cause prototype drift, and degrade generalization. Existing methods mostly assume clean data or require large-scale statistics, both impractical in FSL's data-scarce setting. We observe that clean samples lie in low-rank subspaces of the semantic feature space, whereas noisy samples introduce rank anomalies that disrupt this structure. To exploit this, we propose a differentiable low-rank approximation that estimates the intrinsic rank of the support set and flags anomalous noisy samples. Building on this, a rank-guided diffusion process generates high-quality replacements under low-rank constraints, reconstructing a clean, consistent support set for improved robustness. This low-rank guided approach effectively mitigates prototype drift and significantly reduces errors at noise levels up to 40% on MiniImageNet, TieredImageNet, and other noisy benchmarks, demonstrating the power of low-rank geometry for noise detection and correction in FSL.