Poster in Workshop: The Synergy of Scientific and Machine Learning Modelling (SynS & ML) Workshop

Optimization or Architecture: What Matters in Non-Linear Filtering?

Ido Greenberg · Netanel Yannay · Shie Mannor

Keywords: [ Cholesky parameterization ] [ Kalman filter ] [ non-linear filtering ] [ noise estimation ] [ optimization ]


Abstract:

In non-linear filtering, it is traditional to compare non-linear architectures such as neural networks to the standard linear Kalman Filter (KF). We observe that this methodology mixes the evaluation of two separate components: the non-linear architecture, and the numeric optimization method. In particular, the non-linear model is often optimized, whereas the reference KF model is not. We argue that both should be optimized similarly. We suggest the Optimized KF (OKF), which adjusts numeric optimization to the positive-definite KF parameters. We demonstrate how a significant advantage of a neural network over the KF may entirely vanish once the KF is optimized using OKF. This implies that experimental conclusions of certain previous studies were derived from a flawed process. The benefits of OKF over the non-optimized KF are further studied theoretically and empirically, where OKF demonstrates consistently improved accuracy in a variety of problems.
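
Below is a minimal, illustrative sketch of the idea suggested by the abstract and keywords ("Cholesky parameterization", "noise estimation", "optimization"): tuning a Kalman filter's noise covariances Q and R by gradient descent on the state-estimation error, while keeping them positive-definite via a Cholesky parameterization. This is not the authors' OKF implementation or its API; the helper names (make_spd, kf_mse), the constant-velocity model (F, H), and the toy data are assumptions made for the example.

# Illustrative sketch only; not the authors' OKF code. All model and data
# choices below are assumptions for demonstration purposes.
import torch


def make_spd(params, dim):
    """Map an unconstrained parameter vector to a positive-definite matrix
    via a lower-triangular Cholesky factor L, returning L @ L.T."""
    L = torch.zeros(dim, dim)
    rows, cols = torch.tril_indices(dim, dim)
    L[rows, cols] = params
    # Exponentiate the diagonal so it is strictly positive.
    diag = torch.diag(torch.diag(L))
    L = L - diag + torch.diag(torch.exp(torch.diag(L)))
    return L @ L.T


def kf_mse(z_seq, x_true_seq, F, H, Q, R):
    """Run a linear Kalman filter over one sequence and return the MSE of the
    filtered state estimates against the ground-truth states."""
    dim_x = F.shape[0]
    x, P = torch.zeros(dim_x), torch.eye(dim_x)
    loss = 0.0
    for z, x_true in zip(z_seq, x_true_seq):
        x = F @ x                          # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                # update
        K = P @ H.T @ torch.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (torch.eye(dim_x) - K @ H) @ P
        loss = loss + torch.mean((x - x_true) ** 2)
    return loss / len(z_seq)


# Assumed toy setup: 2-D constant-velocity state, 1-D position observations.
dim_x, dim_z = 2, 1
F = torch.tensor([[1.0, 1.0], [0.0, 1.0]])
H = torch.tensor([[1.0, 0.0]])
x_true_seq = torch.stack([torch.tensor([float(t), 1.0]) for t in range(50)])
z_seq = x_true_seq[:, :1] + 0.3 * torch.randn(50, 1)

# One unconstrained parameter per lower-triangular entry of each covariance.
q_params = torch.zeros(dim_x * (dim_x + 1) // 2, requires_grad=True)
r_params = torch.zeros(dim_z * (dim_z + 1) // 2, requires_grad=True)
optimizer = torch.optim.Adam([q_params, r_params], lr=1e-2)

for step in range(200):
    Q = make_spd(q_params, dim_x)
    R = make_spd(r_params, dim_z)
    loss = kf_mse(z_seq, x_true_seq, F, H, Q, R)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

The point of the Cholesky parameterization here is that Q and R remain valid (positive-definite) covariance matrices at every gradient step, so the filter can be optimized end-to-end on the same data and loss as a competing neural model, rather than being used with hand-tuned noise parameters.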
