Poster
LyaNet: A Lyapunov Framework for Training Neural ODEs
Ivan Dario Jimenez Rodriguez · Aaron Ames · Yisong Yue

Tue Jul 19 03:30 PM -- 05:30 PM (PDT) @ Hall E #310

We propose a method for training neural ordinary differential equations using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing the Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at https://github.com/ivandariojr/LyapunovLearning.
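To make the abstract concrete, below is a minimal PyTorch sketch of the kind of Monte Carlo Lyapunov loss it describes. Everything here is an illustrative assumption rather than the paper's exact formulation: the names f_theta and readout, the choice of a cross-entropy potential V, the rate parameter kappa, and the idea of evaluating the condition at sampled states.

    import torch
    import torch.nn.functional as F

    def lyapunov_loss(f_theta, readout, x, t, y, kappa=1.0):
        """Sketch of a Monte Carlo Lyapunov loss (assumed form).

        f_theta : callable(state, t) -> dstate/dt, the learned inference dynamics
        readout : callable(state) -> class logits (hypothetical output map)
        x       : sampled hidden states, shape (batch, dim)
        t       : sampled times in [0, 1], shape (batch,)
        y       : integer class labels, shape (batch,)
        kappa   : assumed exponential convergence rate
        """
        x = x.detach().requires_grad_(True)
        # Candidate potential: the cross-entropy of the readout approaches zero
        # only at a confident, correct prediction, so driving it down steers the
        # dynamics toward the right answer.
        V = F.cross_entropy(readout(x), y, reduction="none")
        # dV/dx via autograd; create_graph keeps the loss differentiable in theta.
        (grad_V,) = torch.autograd.grad(V.sum(), x, create_graph=True)
        # Time derivative of V along the learned dynamics: V_dot = <dV/dx, f(x, t)>.
        V_dot = (grad_V * f_theta(x, t)).sum(dim=-1)
        # Penalize violations of the exponential-stability condition
        # V_dot + kappa * V <= 0; a zero loss certifies exponential convergence.
        return F.relu(V_dot + kappa * V).mean()

Because the states x can be sampled directly (e.g., from some distribution over the state space) rather than obtained by integrating the ODE, a loss of this shape can be optimized without backpropagating through a solver, which is the practical advantage the abstract mentions.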

Author Information

Ivan Dario Jimenez Rodriguez (California Institute of Technology)
Aaron Ames (Caltech)
Yisong Yue (Caltech)

Yisong Yue is a Professor of Computing and Mathematical Sciences at Caltech and (via sabbatical) a Principal Scientist at Latitude AI. His research interests span both fundamental and applied pursuits, from novel learning-theoretic frameworks all the way to deep learning deployed in autonomous driving on public roads. His work has been recognized with multiple paper awards and nominations, including in robotics, computer vision, sports analytics, machine learning for health, and information retrieval. At Latitude AI, he is working on machine learning approaches to motion planning for autonomous driving.
