
Formal Verification for Neural Networks with General Nonlinearities via Branch-and-Bound
Zhouxing Shi · Qirui Jin · Huan Zhang · Zico Kolter · Suman Jana · Cho-Jui Hsieh

Fri Jul 28 02:30 PM -- 03:00 PM (PDT)

Bound propagation with branch-and-bound (BaB) is currently among the most effective methods for neural network (NN) verification. However, existing BaB works have mostly focused on NNs with piecewise-linear activations, especially ReLU neurons. In this paper, we develop a framework for conducting BaB based on bound propagation with general branching points and an arbitrary number of branches, an important step toward extending verification to models with nonlinearities beyond ReLU. Our framework strengthens verification for common element-wise activation functions, as well as multi-dimensional nonlinear operations that arise naturally in computation graphs, such as multiplication and division. In addition, we find that existing branching heuristics for choosing which neurons to branch, originally developed for ReLU networks, are insufficient for general nonlinearities, and we design a new heuristic named BBPS that outperforms a direct extension of those ReLU heuristics. We empirically demonstrate the effectiveness of our BaB framework on verifying a wide range of NNs with various nonlinearities, including networks with Sigmoid, Tanh, or Sin activations, LSTMs, and ViTs.
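To illustrate the core idea, here is a minimal, hypothetical sketch of branch-and-bound verification on a one-dimensional toy problem: proving f(x) = x(1 - x) stays below a threshold on an interval. It is not the paper's method (which uses bound propagation over full networks); it only shows the two ingredients the abstract highlights: sound but loose bounds that tighten as domains shrink, and branching at general points into more than two branches (unlike ReLU BaB, which always splits in two at zero). The function names, the interval-arithmetic bound, and the evenly spaced branching points are all illustrative choices.

```python
import heapq

def f(x):
    # Toy nonlinear target: f(x) = x * (1 - x), with maximum 0.25 at x = 0.5.
    return x * (1.0 - x)

def interval_bound(a, b):
    """Sound upper bound on f over [a, b] via interval arithmetic.
    Multiplying the intervals [a, b] and [1-b, 1-a] ignores that both
    factors depend on the same x (the dependency problem), so the bound
    is loose on wide domains and tightens as domains shrink."""
    return max(a * (1 - a), a * (1 - b), b * (1 - a), b * (1 - b))

def bab_verify(lo, hi, threshold, num_branches=3, max_domains=10_000):
    """Try to prove f(x) <= threshold for all x in [lo, hi].
    Returns True (verified), False (counterexample found), or None (budget hit)."""
    # Max-heap keyed on the upper bound: always refine the loosest subdomain.
    heap = [(-interval_bound(lo, hi), lo, hi)]
    explored = 0
    while heap and explored < max_domains:
        neg_ub, a, b = heapq.heappop(heap)
        if -neg_ub <= threshold:
            return True   # the loosest remaining bound already certifies the property
        mid = 0.5 * (a + b)
        if f(mid) > threshold:
            return False  # concrete counterexample
        explored += 1
        # Branch at general points (here: evenly spaced) into num_branches pieces.
        pts = [a + (b - a) * i / num_branches for i in range(num_branches + 1)]
        for p, q in zip(pts, pts[1:]):
            heapq.heappush(heap, (-interval_bound(p, q), p, q))
    return None
```

For example, `bab_verify(0.0, 1.0, 0.26)` verifies the property (the true maximum is 0.25), while `bab_verify(0.0, 1.0, 0.24)` finds a counterexample near x = 0.5. The paper's framework plays the same game, but the bounds come from bound propagation through the network and the branching decisions are made by the BBPS heuristic.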

Author Information

Zhouxing Shi (UCLA)
Qirui Jin (University of Michigan)
Huan Zhang (UIUC)
Zico Kolter (Carnegie Mellon University / Bosch Center for AI)
Suman Jana (Columbia University)
Cho-Jui Hsieh (UCLA)
