Provable guarantees for decision tree induction: the agnostic setting

Guy Blanc · Jane Lange · Li-Yang Tan

Keywords: Computational Learning Theory; Learning Theory

Tue 14 Jul 9 a.m. PDT — 9:45 a.m. PDT
Tue 14 Jul 8 p.m. PDT — 8:45 p.m. PDT

Abstract: We give strengthened provable guarantees on the performance of widely employed and empirically successful top-down decision tree learning heuristics. While prior works have focused on the realizable setting, we consider the more realistic and challenging agnostic setting. We show that for all monotone functions $f$ and $s \in \mathbb{N}$, these heuristics construct a decision tree of size $s^{\tilde{O}((\log s)/\varepsilon^2)}$ that achieves error $\le \mathsf{opt}_s + \varepsilon$, where $\mathsf{opt}_s$ denotes the error of the optimal size-$s$ decision tree for $f$. Previously such a guarantee was not known to be achievable by any algorithm, even one that is not based on top-down heuristics. We complement our algorithmic guarantee with a near-matching $s^{\tilde{\Omega}(\log s)}$ lower bound.
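To make the object of study concrete, the following is a minimal sketch of an impurity-based top-down decision tree heuristic of the general kind the abstract refers to (CART-style greedy splitting on Gini impurity over boolean examples). This is an illustrative toy implementation, not the paper's algorithm or analysis; the dataset, function names, and the choice of Gini impurity as the splitting criterion are assumptions made here for the example.

```python
def gini(labels):
    # Gini impurity of a list of 0/1 labels: 2 * p * (1 - p).
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(X, y, features):
    # Greedily pick the feature whose split most reduces weighted Gini
    # impurity; return None if no split gives strictly positive gain.
    base, n = gini(y), len(y)
    best, best_gain = None, 0.0
    for i in features:
        left = [y[j] for j in range(n) if X[j][i] == 0]
        right = [y[j] for j in range(n) if X[j][i] == 1]
        gain = base - (len(left) * gini(left) + len(right) * gini(right)) / n
        if gain > best_gain:
            best, best_gain = i, gain
    return best

def build_tree(X, y, features, depth):
    # Top-down induction: split greedily until depth runs out or no split
    # helps, then emit a majority-label leaf. Internal nodes are tuples
    # (feature, left_subtree, right_subtree); leaves are 0/1 labels.
    if not y:
        return 0
    majority = int(2 * sum(y) >= len(y))
    if depth == 0:
        return majority
    i = best_split(X, y, features)
    if i is None:
        return majority
    lX = [x for x in X if x[i] == 0]
    ly = [y[j] for j in range(len(y)) if X[j][i] == 0]
    rX = [x for x in X if x[i] == 1]
    ry = [y[j] for j in range(len(y)) if X[j][i] == 1]
    return (i, build_tree(lX, ly, features, depth - 1),
               build_tree(rX, ry, features, depth - 1))

def predict(tree, x):
    # Walk from the root to a leaf, branching on the queried feature.
    while isinstance(tree, tuple):
        i, left, right = tree
        tree = right if x[i] else left
    return tree
```

As a quick sanity check, running `build_tree` on the full truth table of the 3-bit majority function (a monotone function, matching the setting of the theorem) recovers a tree that computes majority exactly.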
