
 
Oral
Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case
Alina Beygelzimer · David Pal · Balazs Szorenyi · Devanathan Thiruvenkatachari · Chen-Yu Wei · Chicheng Zhang

Thu Jun 13 10:15 AM -- 10:20 AM (PDT) @ Room 102
We study the problem of efficient online multiclass linear classification with bandit feedback, where all examples belong to one of $K$ classes and lie in the $d$-dimensional Euclidean space. Previous works have left open the challenge of designing efficient algorithms with finite mistake bounds when the data is linearly separable by a margin $\gamma$. In this work, we take a first step towards this problem. We consider two notions of linear separability, \emph{strong} and \emph{weak}.
1. Under the strong linear separability condition, we design an efficient algorithm that achieves a near-optimal mistake bound of $O\left(\frac{K}{\gamma^2}\right)$.
2. Under the more challenging weak linear separability condition, we design an efficient algorithm with a mistake bound of $2^{\widetilde{O}(\min(K \log^2 \frac{1}{\gamma}, \sqrt{\frac{1}{\gamma}} \log K))}$, where $\widetilde{O}(f(\cdot)) = O(f(\cdot)\,\mathrm{polylog}(f(\cdot)))$. Our algorithm is based on the kernel Perceptron and is inspired by the work of \citet{Klivans-Servedio-2008} on improperly learning intersections of halfspaces.
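
For readers unfamiliar with the feedback model, the sketch below illustrates the online protocol the abstract describes: on each round the learner sees a feature vector, commits to one of the $K$ labels, and observes only a single bit indicating whether that guess was correct. The learner shown is a Banditron-style baseline (Kakade, Shalev-Shwartz, and Tewari, 2008), included purely for illustration; it is not the algorithm of this paper, and the class/function names and the exploration rate are assumptions of this sketch.

import numpy as np

# Illustrative sketch only: a Banditron-style baseline under bandit feedback.
# This is NOT the authors' algorithm; it only demonstrates the protocol.
class BanditronBaseline:
    def __init__(self, num_classes, dim, explore=0.1, seed=0):
        self.K = num_classes
        self.W = np.zeros((num_classes, dim))  # one weight vector per class
        self.explore = explore                 # uniform exploration probability (assumed parameter)
        self.rng = np.random.default_rng(seed)

    def act(self, x):
        # Mostly play the greedy argmax class, with a little uniform exploration.
        greedy = int(np.argmax(self.W @ x))
        probs = np.full(self.K, self.explore / self.K)
        probs[greedy] += 1.0 - self.explore
        sampled = int(self.rng.choice(self.K, p=probs))
        return sampled, greedy, probs

    def update(self, x, sampled, greedy, probs, correct):
        # Importance-weighted Perceptron-style update driven only by the 0/1 signal.
        if correct:
            self.W[sampled] += x / probs[sampled]
        self.W[greedy] -= x

def run(stream, learner):
    # stream yields (x, y) pairs; y is never revealed to the learner directly.
    mistakes = 0
    for x, y in stream:
        sampled, greedy, probs = learner.act(x)
        correct = (sampled == y)          # bandit feedback: a single bit per round
        mistakes += int(not correct)
        learner.update(x, sampled, greedy, probs, correct)
    return mistakes

The contrast with the paper is the mistake guarantee: the abstract's algorithms achieve finite mistake bounds under strong or weak linear separability, which is the open problem this baseline-style approach does not resolve.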

Author Information

Alina Beygelzimer (Yahoo Research)
David Pal (Expedia)
Balazs Szorenyi (Yahoo Research)
Devanathan Thiruvenkatachari (New York University)
Chen-Yu Wei (University of Southern California)
Chicheng Zhang (Microsoft Research)
