Faster Training of Neural ODEs Using Gauß–Legendre Quadrature
Alexander Norcliffe · Marc Deisenroth

Neural ODEs demonstrate strong performance in generative and time-series modelling. However, training them via the adjoint method is slow compared to discrete models because it requires numerically solving ODEs. A common way to speed up neural ODEs is to regularise their solutions, but regularisation can reduce the expressivity of the model, which is particularly important when the trajectory itself matters. In this paper, we propose an alternative way to speed up the training of neural ODEs. The key idea is to accelerate the adjoint method by using Gauß–Legendre quadrature to solve the required integrals faster than ODE-based methods while remaining memory efficient. Our approach leads to faster training of neural ODEs, especially for large models.
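As a rough illustration of the core idea (a minimal sketch, not the authors' implementation), the snippet below shows how Gauß–Legendre quadrature approximates a time integral of the kind that appears in the adjoint method, e.g. the parameter-gradient integral over the integration interval. The function name and the toy integrand are hypothetical.

# Sketch: in the adjoint method the parameter gradient is an integral over time,
# which can be approximated with Gauss-Legendre quadrature rather than folded
# into a backward ODE solve. The integrand g(t) below is a stand-in for the
# adjoint-times-Jacobian term a(t)^T df/dtheta.
import numpy as np

def gauss_legendre_integral(g, t0, t1, n_nodes=8):
    """Approximate the integral of g over [t0, t1] with n_nodes-point Gauss-Legendre quadrature."""
    # Nodes and weights on the reference interval [-1, 1]
    nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
    # Affine map from [-1, 1] onto [t0, t1]
    t = 0.5 * (t1 - t0) * nodes + 0.5 * (t1 + t0)
    return 0.5 * (t1 - t0) * sum(w * g(ti) for w, ti in zip(weights, t))

# Toy check: the integral of t^3 over [0, 1] equals 0.25, recovered exactly
# because n-point Gauss-Legendre is exact for polynomials up to degree 2n - 1.
print(gauss_legendre_integral(lambda t: t**3, 0.0, 1.0))

Because the quadrature only needs the integrand at a fixed set of nodes, the memory and runtime cost scale with the number of nodes rather than with the adaptive steps of a backward ODE solver, which is the source of the speed-up described above.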

Author Information

Alexander Norcliffe (University of Cambridge)
Marc Deisenroth (University College London)
