Learning Partial Concept Classes and Universal Rates Under Massart Noise
Abstract
The Massart noise condition is a central model in Probably Approximately Correct (PAC) learning theory. Its importance lies in being an interpolation condition between the realizable and agnostic settings, under which one can attain faster rates than in the latter and, under strict conditions, recover the rates of the former. Despite its importance, the Massart condition has not yet been fully explored in emerging extensions of statistical learning theory beyond the classical PAC framework. In this work, we present two such extensions. First, we revisit the transductive empirical risk minimization (TERM) algorithm of Hanneke & Moran (2026) and derive sharper excess error bounds under Massart noise using offset Rademacher techniques and the local metric entropy introduced by Zhivotovskiy & Hanneke (2018). We then leverage this analysis to obtain new optimal sample complexity bounds for PAC learning with partial concept classes and to complete the characterization of universal learning rates under Massart noise.
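For concreteness, a standard textbook formulation of the Massart (bounded noise) condition reads as follows; this is a gloss on the condition named in the abstract, and the paper's exact parameterization may differ. A distribution over labeled examples satisfies the Massart condition with margin \(\gamma \in (0, \tfrac{1}{2}]\) if

\[
\Pr\left[\, Y \neq f^{\star}(X) \,\middle|\, X = x \,\right] \;=\; \eta(x) \;\le\; \tfrac{1}{2} - \gamma
\qquad \text{for almost every } x,
\]

where \(f^{\star}\) denotes the target (Bayes-optimal) classifier and \(\eta(x)\) is the pointwise label-noise rate. Taking \(\gamma = \tfrac{1}{2}\) forces \(\eta(x) = 0\) and recovers the realizable setting, while \(\gamma \to 0\) approaches the agnostic setting; this is the sense in which Massart noise interpolates between the two regimes.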