

Poster in Workshop on Formal Verification of Machine Learning

Optimized Symbolic Interval Propagation for Neural Network Verification

Philipp Kern · Marko Kleine Büning · Carsten Sinz


Abstract:

Neural networks are increasingly applied in safety-critical domains; their verification is thus gaining importance. A large class of recent algorithms for proving input-output relations of feed-forward neural networks is based on linear relaxations and symbolic interval propagation. However, due to variable dependencies, the approximations deteriorate with increasing depth of the network. In this paper we present DPNeurifyFV, a novel branch-and-bound solver for ReLU networks with low-dimensional input spaces, based on symbolic interval propagation with fresh variables and input splitting. A new heuristic for choosing the fresh variables mitigates the dependency problem, while our novel splitting heuristic, combined with several other improvements, speeds up the branch-and-bound procedure. We evaluate our approach on the ACAS Xu airborne collision avoidance networks and demonstrate runtime improvements over state-of-the-art tools.
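To make the core technique concrete, the following is a minimal sketch of plain symbolic interval propagation through an affine layer and a ReLU layer, using a Neurify-style linear relaxation for unstable neurons. It is an illustrative assumption-laden simplification, not the DPNeurifyFV algorithm itself: the fresh-variable mechanism and the branch-and-bound loop described in the abstract are omitted, and all function and variable names are hypothetical. Each neuron carries a symbolic lower and upper bound that is linear in the network inputs (coefficient matrix plus a constant column).

```python
import numpy as np

def concretize(expr, x_lo, x_hi):
    """Concrete min/max of linear expressions A @ x + c over the box [x_lo, x_hi].
    expr has shape (n, d+1); the last column is the constant term c."""
    A, c = expr[:, :-1], expr[:, -1]
    pos, neg = np.maximum(A, 0.0), np.minimum(A, 0.0)
    lo = pos @ x_lo + neg @ x_hi + c   # minimize: positive coeffs take x_lo
    hi = pos @ x_hi + neg @ x_lo + c   # maximize: positive coeffs take x_hi
    return lo, hi

def affine(lo_expr, up_expr, W, b):
    """Propagate symbolic bounds through y = W x + b, splitting W by sign."""
    pos, neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_lo = pos @ lo_expr + neg @ up_expr
    new_up = pos @ up_expr + neg @ lo_expr
    new_lo[:, -1] += b
    new_up[:, -1] += b
    return new_lo, new_up

def relu(lo_expr, up_expr, x_lo, x_hi):
    """Relax ReLU neuron-wise using concrete pre-activation bounds [l, u]."""
    l, _ = concretize(lo_expr, x_lo, x_hi)
    _, u = concretize(up_expr, x_lo, x_hi)
    new_lo, new_up = lo_expr.copy(), up_expr.copy()
    for i in range(len(l)):
        if u[i] <= 0.0:                 # provably inactive: output is 0
            new_lo[i], new_up[i] = 0.0, 0.0
        elif l[i] < 0.0:                # unstable: linear relaxation
            lam = u[i] / (u[i] - l[i])
            new_lo[i] = lam * lo_expr[i]          # ReLU(x) >= lam * x
            new_up[i] = lam * up_expr[i]          # ReLU(x) <= lam * (x - l)
            new_up[i, -1] -= lam * l[i]
        # provably active (l >= 0): bounds pass through unchanged
    return new_lo, new_up

# Hypothetical toy network: 2 inputs -> ReLU(2) -> 1 output, on the box [-1, 1]^2.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]]); b1 = np.array([0.1, -0.2])
W2 = np.array([[1.0, 1.0]]);              b2 = np.array([0.0])
x_lo, x_hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])

# Inputs start with exact symbolic bounds: the identity expression.
ident = np.hstack([np.eye(2), np.zeros((2, 1))])
lo, up = affine(ident, ident, W1, b1)
lo, up = relu(lo, up, x_lo, x_hi)
lo, up = affine(lo, up, W2, b2)
out_lo, _ = concretize(lo, x_lo, x_hi)
_, out_hi = concretize(up, x_lo, x_hi)
```

The dependency problem the abstract refers to shows up here: each relaxed neuron loosens both symbolic bounds, and the slack compounds with depth because later layers can no longer see that the relaxed terms stem from the same inputs. Introducing fresh variables at selected neurons (as DPNeurifyFV does) and splitting the input box are two ways to recover some of that lost precision.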
