Poster in Workshop: A Blessing in Disguise: The Prospects and Perils of Adversarial Machine Learning

Fast Certified Robust Training with Short Warmup

Zhouxing Shi · Yihan Wang · Huan Zhang · Jinfeng Yi · Cho-Jui Hsieh


Abstract:

State-of-the-art (SOTA) methods for certified robust training, including interval bound propagation (IBP) and CROWN-IBP, usually use a long warmup schedule with hundreds or thousands of epochs and are thus costly. In this paper, we identify two important issues that make certified training difficult and unstable and thus previously required a long warmup: exploded bounds at initialization and imbalance in ReLU activation states. For fast training with a short warmup, we propose three improvements: a weight initialization for IBP training, fully adding Batch Normalization (BN), and regularization during warmup to tighten certified bounds and balance ReLU activation states. With only a short warmup for fast training, we already outperform the literature SOTA trained with hundreds or thousands of epochs under the same network architectures.
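For readers unfamiliar with IBP, the sketch below (not the authors' code; layer sizes and the perturbation radius `eps` are illustrative assumptions) shows how interval bounds are propagated through a small feedforward network to obtain certified output bounds for an L-infinity perturbation set.

```python
import torch
import torch.nn as nn

def ibp_linear(layer: nn.Linear, lb: torch.Tensor, ub: torch.Tensor):
    """Propagate interval bounds [lb, ub] through a linear layer.

    Standard IBP rule: propagate the box center through the weights and
    the box radius through the absolute values of the weights.
    """
    center = (ub + lb) / 2          # midpoint of the input box
    radius = (ub - lb) / 2          # half-width of the input box
    new_center = center @ layer.weight.t() + layer.bias
    new_radius = radius @ layer.weight.abs().t()
    return new_center - new_radius, new_center + new_radius

def ibp_relu(lb: torch.Tensor, ub: torch.Tensor):
    """ReLU is monotone, so interval bounds pass through elementwise."""
    return lb.clamp(min=0), ub.clamp(min=0)

# Hypothetical example: certified output bounds for a 2-layer network
# on an L_inf ball of radius eps around an input x.
torch.manual_seed(0)
layers = [nn.Linear(4, 8), nn.Linear(8, 3)]
x = torch.randn(1, 4)
eps = 0.1
lb, ub = x - eps, x + eps
lb, ub = ibp_linear(layers[0], lb, ub)
lb, ub = ibp_relu(lb, ub)
lb, ub = ibp_linear(layers[1], lb, ub)
print(lb, ub)  # certified lower/upper bounds on the network outputs
```

With a standard random initialization, these bounds tend to grow rapidly with depth (the "exploded bounds at initialization" issue noted in the abstract), which is part of why certified training has traditionally relied on a long warmup.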