

Poster in the 2nd Workshop on Formal Verification of Machine Learning

Robustness Verification for Perception Models against Camera Motion Perturbations

Hanjiang Hu · Changliu Liu · Ding Zhao


Abstract:

Robust perception remains challenging due to the internal vulnerability of DNNs to adversarial examples as well as the external uncertainty of sensing, e.g., sensor placement and motion perturbation. Recent work provides robustness guarantees only in a probabilistic sense, which is insufficient for safety-critical scenarios because of false positive certificates. To this end, we propose the first deterministic provable defense framework against camera motion by extending neural network verification (VNN) methods from ℓp-bounded perturbations to a parameterized camera motion space for robotics applications. Through dense partitions of the image projection from a dense 3D point cloud that fully cover all pixels, every pixel value can be bounded by linear relaxations using linear programming, which makes camera motion perturbations verifiable and compatible with current incomplete and complete formal VNN methods for given DNN models. Extensive experiments are conducted on the Metaroom dataset with dense image projection, and our sound and complete method is more computationally efficient than the randomized-smoothing-based method at small perturbation radii.
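
To make the pipeline described in the abstract concrete, below is a minimal illustrative sketch in Python/NumPy, not the authors' implementation. It partitions a one-dimensional camera translation range, bounds each pixel over a partition by sampling projections of a toy point cloud (the paper instead derives sound per-pixel bounds via linear relaxations and linear programming), and propagates the resulting interval bounds through a small ReLU network as a stand-in for an off-the-shelf incomplete VNN method. All function names, the pinhole camera parameters, the point cloud, and the network are hypothetical.

```python
# Illustrative sketch only: partition camera motion, bound pixels per
# partition, then verify with interval bound propagation. Assumptions:
# pinhole camera, translation along the optical axis, toy data.
import numpy as np

def project(points, tz, fx=100.0, fy=100.0, cx=16.0, cy=16.0, size=32):
    """Project (x, y, z, intensity) points onto a size x size image after
    translating the camera by tz along the optical axis."""
    img = np.zeros((size, size))
    z = points[:, 2] - tz
    valid = z > 1e-3
    u = np.clip((fx * points[valid, 0] / z[valid] + cx).astype(int), 0, size - 1)
    v = np.clip((fy * points[valid, 1] / z[valid] + cy).astype(int), 0, size - 1)
    img[v, u] = points[valid, 3]
    return img

def pixel_bounds_over_partition(points, tz_lo, tz_hi, n_samples=8):
    """Lower/upper bound every pixel over the motion partition [tz_lo, tz_hi].
    Here: min/max over dense samples (an approximation); the paper instead
    computes sound linear relaxations with LP."""
    imgs = np.stack([project(points, t)
                     for t in np.linspace(tz_lo, tz_hi, n_samples)])
    return imgs.min(axis=0), imgs.max(axis=0)

def interval_bound_propagation(x_lo, x_hi, weights, biases):
    """Propagate elementwise input intervals through linear + ReLU layers,
    a crude stand-in for incomplete VNN methods."""
    lo, hi = x_lo.ravel(), x_hi.ravel()
    for i, (W, b) in enumerate(zip(weights, biases)):
        W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
        new_lo = W_pos @ lo + W_neg @ hi + b
        new_hi = W_pos @ hi + W_neg @ lo + b
        if i < len(weights) - 1:  # ReLU on hidden layers only
            new_lo, new_hi = np.maximum(new_lo, 0), np.maximum(new_hi, 0)
        lo, hi = new_lo, new_hi
    return lo, hi

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "dense point cloud": columns are x, y, z, intensity.
    cloud = np.column_stack([rng.uniform(-1, 1, 500),
                             rng.uniform(-1, 1, 500),
                             rng.uniform(2, 4, 500),
                             rng.uniform(0, 1, 500)])
    # Partition a 0.1 m camera translation radius into 10 sub-intervals.
    grid = np.linspace(-0.1, 0.1, 11)
    # A random two-layer network stands in for the perception DNN.
    W1, b1 = rng.normal(size=(64, 32 * 32)) * 0.01, np.zeros(64)
    W2, b2 = rng.normal(size=(2, 64)) * 0.1, np.zeros(2)
    for lo_t, hi_t in zip(grid[:-1], grid[1:]):
        px_lo, px_hi = pixel_bounds_over_partition(cloud, lo_t, hi_t)
        out_lo, out_hi = interval_bound_propagation(
            px_lo, px_hi, [W1, W2], [b1, b2])
        # Certified on this partition if the lower bound of the (assumed)
        # true class exceeds the upper bound of the other class.
        certified = out_lo[0] > out_hi[1]
        print(f"partition [{lo_t:+.2f}, {hi_t:+.2f}] certified: {certified}")
```

The overall motion radius is certified only if every partition is certified; finer partitions tighten the per-pixel bounds at the cost of more verifier calls, which mirrors the precision/cost trade-off discussed for the partition-based approach.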
