PROVEN: Verifying Robustness of Neural Networks with a Probabilistic Approach
Tsui-Wei Weng · Pin-Yu Chen · Lam Nguyen · Mark Squillante · Akhilan Boopathy · Ivan Oseledets · Luca Daniel

Tue Jun 11th 12:15 -- 12:20 PM @ Grand Ballroom

With the prevalence of deep neural networks, quantifying their robustness to adversarial inputs has become an important area of research. However, most of the current literature focuses on the worst-case setting, computing certified lower bounds on the minimum adversarial distortion when input perturbations are constrained within an ℓp ball, and thus offers no robustness assessment beyond the certified range. In this paper, we provide a first look at a probabilistically certifiable setting where the perturbation follows a given distributional characterization. We propose PROVEN, a novel framework to PRObabilistically VErify a Neural network's robustness with statistical guarantees: PROVEN certifies the probability that the classifier's top-1 prediction cannot be altered under any constrained ℓp-norm perturbation of a given input. Notably, PROVEN is derived from a closed-form analysis of current state-of-the-art worst-case neural network robustness verification frameworks, so it can provide probabilistic certificates with little computational overhead on top of existing methods such as Fast-Lin, CROWN, and CNN-Cert. Experiments on small and large MNIST and CIFAR neural network models demonstrate that our probabilistic approach can tighten the robustness certificate by up to around 1.8× and 3.5×, with at least 99.99% confidence, compared with the worst-case robustness certificates delivered by CROWN and CNN-Cert.
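To illustrate the idea (a minimal sketch, not the paper's exact derivation): a Fast-Lin/CROWN-style worst-case verifier can produce a linear lower bound on the top-1 classification margin, margin(δ) ≥ w·δ + b for all ||δ||∞ ≤ ε. If the perturbation δ is instead modeled as uniform over that ℓ∞ ball, a concentration inequality such as Hoeffding's turns the same linear bound into a probabilistic certificate. The weights `w`, offset `b`, and radius `eps` below are hypothetical placeholders for verifier output.

```python
import numpy as np

def prob_margin_violated(w, b, eps):
    """Hoeffding upper bound on P(w . delta + b <= 0) when each delta_i
    is independent and uniform in [-eps, eps].

    Each term w_i * delta_i has mean 0 and lies in an interval of width
    2 * eps * |w_i|, so Hoeffding's inequality gives, for b > 0,
        P(sum(w * delta) <= -b) <= exp(-b**2 / (2 * eps**2 * ||w||_2**2)).
    If b <= 0 even the worst-case bound fails, so the trivial bound 1 is returned.
    """
    w = np.asarray(w, dtype=float)
    if b <= 0:
        return 1.0
    return float(np.exp(-b**2 / (2.0 * eps**2 * np.dot(w, w))))

# Hypothetical linear margin bound from a CROWN-style verifier.
w = np.array([0.3, -0.5, 0.2, 0.4])
b = 0.25    # worst-case margin slack at this perturbation radius
eps = 0.1

p_fail = prob_margin_violated(w, b, eps)
print(f"P(top-1 prediction flips) <= {p_fail:.4f}")
# The probabilistic certificate: the top-1 prediction is unchanged with
# probability at least 1 - p_fail under this random perturbation model.
```

Because the probability bound decays exponentially in b²/ε², a point that is only marginally certified in the worst case can still carry a very high-confidence probabilistic certificate, which is the intuition behind the tighter certified radii reported above.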

Author Information

Lily Weng (MIT)
Pin-Yu Chen (IBM Research AI)
Lam Nguyen (IBM Research, Thomas J. Watson Research Center)
Mark Squillante (IBM Research)
Akhilan Boopathy (MIT)
Ivan Oseledets (Skolkovo Institute of Science and Technology)
Luca Daniel (Massachusetts Institute of Technology)

