
Agnostic Learning of Halfspaces with Gradient Descent via Soft Margins
Spencer Frei · Yuan Cao · Quanquan Gu

Wed Jul 21 06:00 AM -- 06:20 AM (PDT)

We analyze the properties of gradient descent on convex surrogates for the zero-one loss for the agnostic learning of halfspaces. We show that when a quantity we refer to as the "soft margin" is well-behaved (a condition satisfied by log-concave isotropic distributions, among others), minimizers of convex surrogates for the zero-one loss are approximate minimizers for the zero-one loss itself. Since standard convex optimization arguments yield efficient guarantees for minimizing convex surrogates of the zero-one loss, our methods give the first positive guarantees for the classification error of halfspaces learned by gradient descent using the binary cross-entropy or hinge loss in the presence of agnostic label noise.
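To make the setting concrete, here is a minimal sketch (not the paper's algorithm or analysis) of the kind of procedure the abstract describes: gradient descent on the binary cross-entropy (logistic) loss, a convex surrogate for the zero-one loss, applied to halfspace data with flipped labels. The Gaussian marginal, 5% noise rate, learning rate, and iteration count are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 5

# Ground-truth halfspace w_star; isotropic Gaussian marginal is one example
# of a log-concave isotropic distribution.
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)

X = rng.normal(size=(n, d))
y = np.sign(X @ w_star)
flip = rng.random(n) < 0.05   # crude stand-in for agnostic label noise: flip 5% of labels
y[flip] *= -1

def surrogate_loss(w):
    # binary cross-entropy written in margin form: mean log(1 + exp(-y <w, x>))
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

w = np.zeros(d)
lr = 0.5
for _ in range(500):
    margins = y * (X @ w)
    # gradient of the mean logistic loss with respect to w
    grad = -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
    w -= lr * grad

zero_one_error = np.mean(np.sign(X @ w) != y)
print(f"surrogate loss: {surrogate_loss(w):.3f}, zero-one error: {zero_one_error:.3f}")
```

The point of the sketch is the relationship the paper quantifies: the iterate minimizes the convex surrogate, and under a well-behaved soft margin that surrogate minimizer also has small zero-one (classification) error.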

Author Information

Spencer Frei (UCLA)
Yuan Cao (UCLA)
Quanquan Gu (University of California, Los Angeles)
