Poster

Relative Deviation Margin Bounds

Corinna Cortes · Mehryar Mohri · Ananda Theertha Suresh

Keywords: [ Models of Learning and Generalization ]

[ Visit Poster at Spot D6 in Virtual World ]
Wed 21 Jul 9 a.m. PDT — 11 a.m. PDT
 
Spotlight presentation: Learning Theory 10
Wed 21 Jul 6 p.m. PDT — 7 p.m. PDT

Abstract: We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical $\ell_\infty$-covering number of the hypothesis set used; both are distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
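For background, the empirical margin loss referred to in the abstract has the standard definition given below, and the second display only sketches the general shape of a relative deviation margin bound (empirical margin loss plus terms scaling with its square root). The complexity term $\mathcal{C}_{m,\rho}(\mathcal{H},\delta)$ is a placeholder introduced here for illustration; it is not the paper's exact statement or constants.

```latex
% Standard empirical margin loss of a predictor h on a sample
% S = ((x_1, y_1), ..., (x_m, y_m)), with confidence margin rho > 0.
\[
\widehat{R}_{S,\rho}(h)
  \;=\; \frac{1}{m} \sum_{i=1}^{m} \mathbb{1}\bigl[y_i\, h(x_i) \le \rho\bigr].
\]

% Schematic shape of a relative deviation margin bound (illustrative only;
% see the paper for the precise complexity terms and constants): with
% probability at least 1 - delta over the draw of the sample S,
\[
R(h)
  \;\le\; \widehat{R}_{S,\rho}(h)
  \;+\; \sqrt{\widehat{R}_{S,\rho}(h)\,\mathcal{C}_{m,\rho}(\mathcal{H},\delta)}
  \;+\; \mathcal{C}_{m,\rho}(\mathcal{H},\delta),
\]
% where C_{m,rho}(H, delta) stands for a complexity term, e.g. one expressed
% via the Rademacher complexity or the empirical ell_infty-covering number
% of the hypothesis set H.
```

When the empirical margin loss $\widehat{R}_{S,\rho}(h)$ is small, the square-root interaction term dominates the slack, which is what makes such relative deviation bounds more favorable than additive ones of the form $\widehat{R}_{S,\rho}(h) + \mathcal{C}_{m,\rho}(\mathcal{H},\delta)$.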
