

Poster in Workshop: 2nd Workshop on Formal Verification of Machine Learning

Formal Control Synthesis for Stochastic Neural Network Dynamic Models

Steven Adams · Morteza Lahijanian · Luca Laurenti


Abstract:

Neural networks (NNs) are emerging as powerful tools to represent the dynamics of control systems with complicated physics or black-box components. Due to the complexity of NNs, however, existing methods are unable to synthesize complex behaviors with guarantees for NN dynamic models (NNDMs). This work introduces a control synthesis framework for stochastic NNDMs with performance guarantees. The focus is on specifications expressed in linear temporal logic interpreted over finite traces (LTLf), and the approach is based on finite abstraction. Specifically, we leverage recent techniques for convex relaxation of NNs to formally abstract an NNDM into an interval Markov decision process (IMDP). Then, a strategy that maximizes the probability of satisfying a given specification is synthesized over the IMDP and mapped back to the underlying NNDM. We show that abstracting an NNDM into an IMDP reduces to a set of convex optimization problems, which guarantees efficiency. We also present an adaptive refinement procedure that makes the framework scalable. On several case studies, we illustrate that our framework is able to provide non-trivial guarantees of correctness for NNDMs with architectures of up to 5 hidden layers and hundreds of neurons per layer.
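To make the synthesis step concrete, below is a minimal sketch (not the paper's implementation) of robust value iteration over an IMDP, computing a lower bound on the probability of reaching a target region together with a maximizing strategy. The interval transition bounds P_low and P_up, the target indicator, and all function names are illustrative assumptions; in the framework described above, these bounds would be obtained from the convex relaxation of the NNDM over a finite partition of the state space.

```python
# Illustrative sketch only: robust value iteration over an interval MDP (IMDP),
# maximizing a lower bound on the probability of reaching a target set.
# P_low / P_up are assumed interval transition bounds, shape (actions, states, states).
import numpy as np

def worst_case_expectation(p_low, p_up, values):
    """Adversarial distribution within [p_low, p_up] that minimizes E[values]."""
    order = np.argsort(values)          # give as much mass as possible to low-value successors
    p = p_low.copy()
    budget = 1.0 - p.sum()              # remaining probability mass to distribute
    for s in order:
        add = min(p_up[s] - p_low[s], budget)
        p[s] += add
        budget -= add
        if budget <= 0:
            break
    return float(p @ values)

def robust_value_iteration(P_low, P_up, target, n_iter=200, tol=1e-9):
    """Return a lower-bound value function and a maximizing action per abstract state."""
    n_actions, n_states, _ = P_low.shape
    v = target.astype(float)            # 1 on target states, 0 elsewhere
    for _ in range(n_iter):
        q = np.empty((n_actions, n_states))
        for a in range(n_actions):
            for s in range(n_states):
                q[a, s] = worst_case_expectation(P_low[a, s], P_up[a, s], v)
        v_new = np.maximum(target, q.max(axis=0))   # target states stay absorbing at value 1
        if np.max(np.abs(v_new - v)) < tol:
            v = v_new
            break
        v = v_new
    strategy = q.argmax(axis=0)
    return v, strategy
```

Under this sketch, v[s] is a guaranteed lower bound on the satisfaction probability from abstract state s under the returned strategy, mirroring how a strategy synthesized over the IMDP abstraction is mapped back to the underlying NNDM.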
