The pioneering work on sparse local embeddings for extreme classification (SLEEC) (Bhatia et al., 2015) has shown great promise in multi-label learning. Unfortunately, the statistical rate of convergence and the oracle property of SLEEC are still not well understood. To fill this gap, we present a unified framework for SLEEC with a nonconvex penalty. Theoretically, we rigorously prove that our proposed estimator enjoys the oracle property (i.e., it performs as well as if the underlying model were known beforehand) and attains a desirable statistical rate of convergence. Moreover, we show that, under a mild condition on the magnitude of the entries in the underlying model, an improved convergence rate can be obtained. Extensive numerical experiments verify our theoretical findings and demonstrate the superiority of our proposed estimator.
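The abstract does not spell out the estimator, but the general idea of sparse estimation with a nonconvex penalty can be sketched with a generic proximal-gradient loop using the MCP (minimax concave penalty). This is an illustrative sketch only, not the paper's SLEEC algorithm; the function names, hyperparameters (`lam`, `gamma`), and problem sizes are all assumptions for demonstration.

```python
import numpy as np

def mcp_prox(z, lam, gamma=3.0):
    # Thresholding (proximal-style) operator of the MCP, a standard
    # nonconvex penalty. lam and gamma are illustrative values,
    # not hyperparameters taken from the paper.
    out = np.zeros_like(z)
    a = np.abs(z)
    mid = (a > lam) & (a <= gamma * lam)
    out[mid] = np.sign(z[mid]) * (a[mid] - lam) / (1.0 - 1.0 / gamma)
    big = a > gamma * lam
    out[big] = z[big]           # large entries pass through unpenalized
    return out

def prox_grad_mcp(X, y, lam=0.1, gamma=3.0, n_iter=500):
    # Proximal gradient descent for least squares with an MCP penalty:
    # a generic nonconvex sparse-regression sketch, NOT SLEEC itself.
    n, d = X.shape
    step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz const of the gradient
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n       # gradient of (1/2n)||y - Xw||^2
        w = mcp_prox(w - step * grad, step * lam, gamma)
    return w
```

The point of the sketch is the last line of `mcp_prox`: unlike the L1 penalty, which shrinks every coefficient by a constant, MCP leaves sufficiently large coefficients untouched. This unbiasedness on large entries is, roughly, what makes oracle-type guarantees plausible for nonconvex penalties.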
Weiwei Liu (Wuhan University)
Xiaobo Shen (Nanjing University of Science and Technology)
Related Events
2019 Oral: Sparse Extreme Multi-label Learning with Oracle Property
Tue Jun 11th, 12:00 -- 12:05 PM, Seaside Ballroom