A Fine-Grained Understanding of Uniform Convergence for Halfspaces
Aryeh Kontorovich ⋅ Kasper Green Larsen
Abstract
We study the fine-grained uniform convergence behavior of halfspaces beyond worst-case VC bounds. For inhomogeneous halfspaces in $\mathbb{R}^d$ with $d\ge 2$, we show that standard first-order VC bounds are essentially tight: even consistent hypotheses can incur population error $\Theta(d\log(n/d)/n)$, and in the agnostic setting the deviation scales as $\sqrt{\tau\log(1/\tau)}$ at true error $\tau$. In contrast, homogeneous halfspaces in $\mathbb{R}^2$ exhibit markedly different behavior. In the realizable case, every hypothesis consistent with the sample has error $O(1/n)$. In the agnostic case, we prove a bandwise, log-free deviation bound on each dyadic risk band via a critical-wedge localization argument. Taking a union over bands incurs only a $\log\log n$ overhead, and we establish a matching lower bound showing that this overhead is unavoidable. Together, these results give a fine-grained and nearly complete picture of uniform convergence for halfspaces, revealing sharp dimensional and structural thresholds.