The pioneering work on sparse local embeddings has shown great promise in multi-label classification. Unfortunately, the statistical rate of convergence and the oracle property of sparse local embeddings are still not well understood. To fill this gap, we present a unified framework for this method with a nonconvex penalty. Theoretically, we rigorously prove that our proposed estimator enjoys the oracle property (i.e., it performs as well as if the underlying model were known beforehand) and achieves a desirable statistical convergence rate. Moreover, we show that under a mild condition on the magnitude of the entries in the underlying model, we obtain an improved convergence rate. Extensive numerical experiments verify our theoretical findings and the superiority of our proposed estimator.
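To make the general recipe concrete, the sketch below illustrates one way to learn a sparse linear map from features X to a low-dimensional local label embedding Z under a nonconvex MCP penalty, solved by proximal gradient descent. This is only an illustrative outline, not the authors' implementation: the function names (`mcp_prox`, `fit_sparse_embedding_regressor`), the choice of MCP, and all parameter values are assumptions made for the example.

```python
# Illustrative sketch (assumed setup, not the paper's code): sparse regression
# onto a low-dimensional label embedding with a nonconvex MCP penalty,
# solved by proximal gradient descent.
import numpy as np

def mcp_prox(z, lam, gamma, step):
    """Elementwise proximal operator of the MCP penalty (requires gamma > step)."""
    out = z.copy()
    small = np.abs(z) <= gamma * lam
    # Soft-threshold-then-rescale region of MCP; large entries pass through
    # unshrunk, which is what gives the oracle-like behaviour.
    shrunk = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0) / (1.0 - step / gamma)
    out[small] = shrunk[small]
    return out

def fit_sparse_embedding_regressor(X, Z, lam=0.1, gamma=3.0, n_iter=500):
    """Minimize ||Z - X W||_F^2 / (2n) + MCP(W), where Z is a local label embedding."""
    n, d = X.shape
    k = Z.shape[1]
    W = np.zeros((d, k))
    # Step size = 1 / Lipschitz constant of the smooth part's gradient.
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)
    for _ in range(n_iter):
        grad = X.T @ (X @ W - Z) / n
        W = mcp_prox(W - step * grad, lam, gamma, step)
    return W
```

In this hypothetical setup, Z would come from a separate local-embedding step (e.g., a low-rank decomposition of the label matrix), and the nonconvex MCP penalty plays the role of the nonconvex regularizer discussed above: unlike the L1 penalty, it leaves large coefficients essentially unbiased, which is the intuition behind oracle-type guarantees.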
Author Information
Weiwei Liu (Wuhan University)
Xiaobo Shen (Nanjing University of Science and Technology)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Sparse Extreme Multi-label Learning with Oracle Property
  Wed. Jun 12th, 01:30 -- 04:00 AM, Room: Pacific Ballroom #126
More from the Same Authors
- 2022: Robustness Verification for Contrastive Learning
  Zekai Wang · Weiwei Liu
- 2023 Poster: Better Diffusion Models Further Improve Adversarial Training
  Zekai Wang · Tianyu Pang · Chao Du · Min Lin · Weiwei Liu · Shuicheng YAN
- 2023 Poster: Delving into Noisy Label Detection with Clean Data
  Chenglin Yu · Xinsong Ma · Weiwei Liu
- 2023 Poster: DDGR: Continual Learning with Deep Diffusion-based Generative Replay
  Rui Gao · Weiwei Liu
- 2023 Oral: Delving into Noisy Label Detection with Clean Data
  Chenglin Yu · Xinsong Ma · Weiwei Liu
- 2022 Poster: Robustness Verification for Contrastive Learning
  Zekai Wang · Weiwei Liu
- 2022 Oral: Robustness Verification for Contrastive Learning
  Zekai Wang · Weiwei Liu
- 2020 Poster: Adaptive Adversarial Multi-task Representation Learning
  YUREN MAO · Weiwei Liu · Xuemin Lin