Poster
Relative Deviation Margin Bounds
Corinna Cortes · Mehryar Mohri · Ananda Theertha Suresh

Wed Jul 21 09:00 AM -- 11:00 AM (PDT)
We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical $\ell_\infty$-covering number of the hypothesis set used, both distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
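For intuition, relative deviation margin bounds in this line of work typically admit the following schematic form. The sketch below is an illustrative instance assuming binary classification with labels in $\{-1, +1\}$, with constants omitted; it is not the paper's exact statement. Given a sample $(x_1, y_1), \ldots, (x_m, y_m)$, a margin $\rho > 0$, and the empirical margin loss
\[
\widehat{R}_\rho(h) = \frac{1}{m} \sum_{i=1}^{m} 1_{y_i h(x_i) \le \rho},
\]
a relative deviation margin bound states that, with probability at least $1 - \delta$, for all $h \in H$,
\[
R(h) \le \widehat{R}_\rho(h)
  + O\!\left( \sqrt{\widehat{R}_\rho(h)} \left( \frac{\mathfrak{R}_m(H)}{\rho} + \sqrt{\frac{\log(1/\delta)}{m}} \right)
  + \frac{\mathfrak{R}_m(H)}{\rho} + \frac{\log(1/\delta)}{m} \right),
\]
where $R(h)$ is the generalization error and $\mathfrak{R}_m(H)$ the Rademacher complexity of $H$. Because the dominant complexity terms are scaled by $\sqrt{\widehat{R}_\rho(h)}$, such bounds are more favorable than standard additive margin bounds when the empirical margin loss is small.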

Author Information

Corinna Cortes (Google Research)
Mehryar Mohri (Google Research and Courant Institute of Mathematical Sciences)
Ananda Theertha Suresh (Google Research)
