
Relative Deviation Margin Bounds
Corinna Cortes · Mehryar Mohri · Ananda Theertha Suresh

Wed Jul 21 06:40 PM -- 06:45 PM (PDT)
We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical $\ell_\infty$-covering number of the hypothesis set used; both are distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
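For context, a sketch of the classical (non-relative) margin bound that results of this type refine. In the standard formulation, for a hypothesis set $H$, margin parameter $\rho > 0$, sample size $m$, and confidence $1 - \delta$, one has, with high probability for all $h \in H$:

```latex
% Classical margin bound (Koltchinskii-Panchenko style), stated for context;
% R(h) is the generalization error, \widehat{R}_{S,\rho}(h) the empirical
% margin loss on sample S, and \mathfrak{R}_m(H) the Rademacher complexity.
R(h) \;\le\; \widehat{R}_{S,\rho}(h)
  \;+\; \frac{2}{\rho}\,\mathfrak{R}_m(H)
  \;+\; \sqrt{\frac{\log \frac{1}{\delta}}{2m}}.
```

A relative deviation bound instead scales the deviation term with the empirical margin loss itself, so the guarantee tightens when $\widehat{R}_{S,\rho}(h)$ is small; the precise form of the bounds here is given in the paper.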

Author Information

Corinna Cortes (Google Research)
Mehryar Mohri (Google Research and Courant Institute of Mathematical Sciences)
Ananda Theertha Suresh (Google Research)
