Adversarially robust learning has been a challenge in contemporary machine learning, and recent years have witnessed increasing attention to robust decision trees and ensembles, most of which either incur high computational complexity or lack guarantees of provable robustness. This work proposes the Fast Provably Robust Decision Tree (FPRDT) with the smallest computational complexity, O(n log n), achieved by a tradeoff between global and local optimization over the adversarial 0/1 loss. We further develop Provably Robust AdaBoost (PRAdaBoost) on top of our robust decision trees and present a convergence analysis for the adversarial 0/1 training loss. We conduct extensive experiments to support our approaches; in particular, they are superior to methods without provable robustness, and achieve performance better than or comparable to provably robust methods, yet with the smallest running time.
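The abstract only summarizes the approach at a high level. As a rough illustration of how a split over the adversarial 0/1 loss can be found in O(n log n) time, below is a minimal sketch (not the authors' released code) for a single-feature decision stump under an L-infinity perturbation budget. The function name `best_robust_stump`, the binary-label setup, and the fixed leaf orientation are illustrative assumptions, not details taken from the paper.

```python
import itertools
import numpy as np

def best_robust_stump(values, labels, eps, y_left=0, y_right=1):
    """Threshold minimizing the adversarial 0/1 loss of a stump (a sketch).

    Under an L-infinity budget eps, a point with feature value v is surely
    routed left once v + eps <= t, surely right while v - eps > t, and can
    be pushed to either side by the adversary in between, where it is always
    misclassified when the two leaves predict different labels. Sweeping the
    2n event points in sorted order updates the error count in O(1) per
    event, which is where the O(n log n) total cost comes from.
    """
    events = []
    for v, y in zip(values, labels):
        # right -> ambiguous at t = v - eps: error rises iff y == y_right
        events.append((v - eps, 1 if y == y_right else 0))
        # ambiguous -> surely left at t = v + eps: error drops iff y == y_left
        events.append((v + eps, -1 if y == y_left else 0))
    events.sort()

    errors = int(np.sum(labels != y_right))  # t = -inf: everything goes right
    best_err, best_t = errors, -np.inf
    # Apply all events sharing a position before checking, since those
    # transitions happen at the same threshold.
    for pos, group in itertools.groupby(events, key=lambda e: e[0]):
        for _, delta in group:
            errors += delta
        if errors < best_err:
            best_err, best_t = errors, pos
    return best_t, best_err / len(values)
```

In a full tree this sweep would be repeated per feature and with both leaf orientations (swap `y_left` and `y_right`), and FPRDT additionally trades off global against local optimization as described in the abstract; the sketch only shows the sorted-sweep idea behind the stated complexity.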
Author Information
Jun-Qi Guo (Nanjing University)
Ming-Zhuo Teng (Nanjing University)
Wei Gao (Nanjing University)
Zhi-Hua Zhou (Nanjing University)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Spotlight: Fast Provably Robust Decision Trees and Boosting
  Thu. Jul 21st 05:50 -- 05:55 PM, Room 309
More from the Same Authors
- 2020 Poster: Safe Deep Semi-Supervised Learning for Unseen-Class Unlabeled Data
  Lan-Zhe Guo · Zhen-Yu Zhang · Yuan Jiang · Yu-Feng Li · Zhi-Hua Zhou
- 2017 Poster: Multi-Class Optimal Margin Distribution Machine
  Teng Zhang · Zhi-Hua Zhou
- 2017 Talk: Multi-Class Optimal Margin Distribution Machine
  Teng Zhang · Zhi-Hua Zhou