Learning Efficient and Robust Ordinary Differential Equations via Invertible Neural Networks
Weiming Zhi · Tin Lai · Lionel Ott · Edwin V Bonilla · Fabio Ramos

Thu Jul 21 03:00 PM -- 05:00 PM (PDT) @ Hall E #306

Advances in differentiable numerical integrators have enabled the use of gradient descent techniques to learn ordinary differential equations (ODEs), where a flexible function approximator (often a neural network) is used to estimate the system dynamics, given as a time derivative. However, these integrators can be unsatisfactorily slow and unstable when learning systems of ODEs from long sequences. We propose to learn an ODE of interest from data by viewing its dynamics as a vector field related to another base vector field via a diffeomorphism (i.e., a differentiable bijection), represented by an invertible neural network (INN). By learning both the INN and the dynamics of the base ODE, we provide an avenue to offload some of the complexity of modelling the dynamics directly onto the INN. Consequently, by restricting the base ODE to be amenable to integration, we can speed up and improve the robustness of integrating trajectories from the learned system. We demonstrate the efficacy of our method in training and evaluating benchmark ODE systems, as well as within continuous-depth neural network models. We show that our approach attains speed-ups of up to two orders of magnitude when integrating learned ODEs.
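The core idea in the abstract can be sketched in code. The following is a minimal illustration, not the paper's implementation: it substitutes a hand-picked 1-D monotone map for a trained INN and uses a linear base ODE with a closed-form flow, so integrating the learned system reduces to mapping into base coordinates, applying the analytic flow, and mapping back. The function names (`f`, `f_inv`, `base_flow`, `integrate`) are illustrative choices, not from the paper.

```python
import numpy as np

# Stand-in "INN": a 1-D monotone diffeomorphism f(z) = z + 0.5*tanh(z).
# (In the paper this would be a trained invertible neural network.)
def f(z):
    return z + 0.5 * np.tanh(z)

def f_inv(x, iters=50):
    # Invert f by Newton's method; f'(z) = 1 + 0.5*(1 - tanh(z)^2) >= 0.5,
    # so the iteration is well conditioned.
    z = np.array(x, dtype=float)
    for _ in range(iters):
        z -= (f(z) - x) / (1.0 + 0.5 * (1.0 - np.tanh(z) ** 2))
    return z

# Base ODE dz/dt = -z, chosen to be "amenable to integration":
# its flow is available in closed form, z(t) = z0 * exp(-t).
def base_flow(z0, t):
    return z0 * np.exp(-t)

# Trajectory of the learned system: pull the initial condition back to
# base coordinates, integrate there (analytically, no ODE solver needed),
# then push the whole trajectory forward through the diffeomorphism.
def integrate(x0, ts):
    z0 = f_inv(x0)
    return f(base_flow(z0, ts))

x0 = np.array([1.5])
ts = np.linspace(0.0, 5.0, 100)
traj = integrate(x0, ts)
```

Because the expensive numerical integration is replaced by a closed-form flow in base coordinates, the per-trajectory cost is dominated by the (cheap, parallelizable) forward and inverse passes through the bijection, which is the source of the speed-ups the abstract describes.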

Author Information

Weiming Zhi (University of Sydney)
Tin Lai (The University of Sydney)
Lionel Ott (ETH)
Edwin V Bonilla (CSIRO's Data61)
Fabio Ramos (NVIDIA, University of Sydney)
