Oral
Optimality Implies Kernel Sum Classifiers are Statistically Efficient
Raphael Meyer · Jean Honorio

Tue Jun 11 04:35 PM -- 04:40 PM (PDT) @ Room 103

We propose a novel combination of optimization tools with learning theory bounds in order to analyze the sample complexity of optimal kernel sum classifiers. This contrasts with typical learning-theoretic results, which hold for all (potentially suboptimal) classifiers. Our work also justifies assumptions made in prior work on multiple kernel learning. As a byproduct of this analysis, we provide a new form of Rademacher complexity for hypothesis classes containing only optimal classifiers.
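To make the object of study concrete, the sketch below shows what a kernel sum classifier looks like in practice: a conic combination of base kernels fed to a standard kernel machine. This is not code from the paper; the choice of base kernels, the uniform weights `mu`, and the toy data are all illustrative assumptions, and the paper's analysis concerns the optimal such classifier rather than this fixed-weight example.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel

# Toy data (hypothetical): two Gaussian blobs with labels -1 and +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

# A kernel sum classifier uses a nonnegative combination of base kernels,
#   K(x, x') = sum_m mu_m * K_m(x, x'),   mu_m >= 0,
# and trains a kernel machine (here an SVM) on the combined Gram matrix.
base_kernels = [linear_kernel, polynomial_kernel, rbf_kernel]
mu = np.ones(len(base_kernels)) / len(base_kernels)  # uniform weights, for illustration only

def kernel_sum(A, B):
    """Weighted sum of the base kernels evaluated between rows of A and rows of B."""
    return sum(w * k(A, B) for w, k in zip(mu, base_kernels))

clf = SVC(kernel="precomputed", C=1.0)
clf.fit(kernel_sum(X, X), y)                     # fit on the training Gram matrix
print("train accuracy:", clf.score(kernel_sum(X, X), y))
```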

Author Information

Raphael Meyer (Purdue University)

Graduating with a BS from Purdue University in Spring 2019; joining NYU Tandon in Fall 2019.

Jean Honorio (Purdue University)
