

Poster in Workshop: 2nd Workshop on Formal Verification of Machine Learning

Formal Verification for Neural Networks with General Nonlinearities via Branch-and-Bound

Zhouxing Shi · Qirui Jin · Huan Zhang · Zico Kolter · Suman Jana · Cho-Jui Hsieh


Abstract:

Bound propagation with branch-and-bound (BaB) is currently among the most effective methods for neural network (NN) verification. However, existing BaB methods have mostly focused on NNs with piecewise linear activations, especially ReLU neurons. In this paper, we develop a framework for conducting BaB based on bound propagation with general branching points and an arbitrary number of branches, an important step toward extending verification to models with nonlinearities beyond ReLU. Our framework strengthens verification for common \emph{element-wise} activation functions, as well as \emph{multi-dimensional} nonlinear operations that arise naturally in computation graphs, such as multiplication and division. In addition, we find that branching heuristics designed for choosing neurons to branch in ReLU networks are insufficient for general nonlinearities, and we design a new heuristic named BBPS, which outperforms heuristics obtained by directly extending those developed for ReLU networks. We empirically demonstrate the effectiveness of our BaB framework on verifying a wide range of NNs with various nonlinearities, including networks with Sigmoid, Tanh, or Sin activations, LSTMs, and ViTs.
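
To illustrate the core idea of branching at general points for a non-ReLU nonlinearity, here is a minimal Python sketch. It is not the paper's implementation: it bounds a toy one-dimensional property involving a Sin nonlinearity, where plain interval arithmetic is loose because it loses the dependence between terms, and branching the input interval at general points (rather than ReLU's single split at 0) tightens the bound. Names such as `bound_g` and `bab_upper_bound` are illustrative, not from the paper.

```python
import math

def sin_range(lo: float, hi: float):
    """Exact range of sin on [lo, hi]: check the endpoints plus any
    interior critical points at pi/2 + k*pi."""
    vals = [math.sin(lo), math.sin(hi)]
    k = math.ceil((lo - math.pi / 2) / math.pi)
    x = math.pi / 2 + k * math.pi
    while x <= hi:
        vals.append(math.sin(x))
        x += math.pi
    return min(vals), max(vals)

def bound_g(lo: float, hi: float):
    """Sound interval bound on g(x) = sin(x) - 0.5*x. Loose, because
    interval arithmetic treats sin(x) and 0.5*x as independent."""
    slo, shi = sin_range(lo, hi)
    return slo - 0.5 * hi, shi - 0.5 * lo

def bab_upper_bound(lo: float, hi: float, k: int) -> float:
    """Branch [lo, hi] at k-1 evenly spaced general branching points
    into k branches and take the worst per-branch upper bound; the
    property is verified only if every branch satisfies it."""
    pts = [lo + (hi - lo) * i / k for i in range(k + 1)]
    return max(bound_g(a, b)[1] for a, b in zip(pts[:-1], pts[1:]))

# More branches -> tighter certified upper bound on g over [0, 4]
# (the true maximum is about 0.342, attained at x = pi/3).
for k in (1, 2, 4, 8, 16):
    print(f"{k:2d} branches: upper bound = {bab_upper_bound(0.0, 4.0, k):.4f}")
```

In a real verifier, each branch would instead tighten a linear relaxation propagated through the whole network, and a heuristic such as the paper's BBPS would decide which neuron to branch next; this sketch only shows why branching general nonlinearities at general points tightens bounds at all.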
