
A Dynamical System Perspective for Lipschitz Neural Networks
Laurent Meunier · Blaise Delattre · Alexandre Araujo · Alexandre Allauzen

Wed Jul 20 10:50 AM -- 11:10 AM (PDT) @ Room 318 - 320
The Lipschitz constant of neural networks has been established as a key quantity for enforcing robustness to adversarial examples. In this paper, we tackle the problem of building $1$-Lipschitz Neural Networks. By studying Residual Networks from a continuous-time dynamical system perspective, we provide a generic method to build $1$-Lipschitz Neural Networks and show that some previous approaches are special cases of this framework. We then extend this reasoning and show that ResNet flows derived from convex potentials define $1$-Lipschitz transformations, which leads us to define the {\em Convex Potential Layer} (CPL). A comprehensive set of experiments on several datasets demonstrates the scalability of our architecture and its benefits as an $\ell_2$-provable defense against adversarial examples. Our code is available at \url{https://github.com/MILES-PSL/Convex-Potential-Layer}
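As a rough illustration of the idea, the paper's Convex Potential Layer takes the residual form $z = x - \frac{2}{\|W\|_2^2} W^\top \mathrm{ReLU}(Wx + b)$, which is provably $1$-Lipschitz in the $\ell_2$ norm. A minimal NumPy sketch (function and variable names are ours, not from the authors' released code; see the linked repository for the actual implementation):

```python
import numpy as np

def convex_potential_layer(x, W, b):
    """Forward pass of one Convex Potential Layer:
    z = x - (2 / ||W||_2^2) * W^T ReLU(W x + b).
    The residual step derived from a convex potential makes
    the map 1-Lipschitz with respect to the l2 norm."""
    lip = np.linalg.norm(W, ord=2) ** 2          # squared spectral norm of W
    return x - (2.0 / lip) * W.T @ np.maximum(W @ x + b, 0.0)

# Empirical sanity check: the layer never expands l2 distances.
rng = np.random.default_rng(0)
W = rng.standard_normal((32, 16))
b = rng.standard_normal(32)
x1, x2 = rng.standard_normal(16), rng.standard_normal(16)
ratio = (np.linalg.norm(convex_potential_layer(x1, W, b)
                        - convex_potential_layer(x2, W, b))
         / np.linalg.norm(x1 - x2))
print(ratio <= 1.0 + 1e-9)
```

In practice the spectral norm $\|W\|_2$ is estimated with power iteration rather than an exact SVD, so that the layer remains cheap to train; the bound still holds because only an upper estimate of the norm is needed.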

Author Information

Laurent Meunier (Dauphine University - FAIR Paris)
Blaise Delattre (Université Paris-Dauphine)
Alexandre Araujo (INRIA)

I am currently a postdoctoral researcher at INRIA and École Normale Supérieure (ENS) in the WILLOW project-team in Paris, France. I work with Jean Ponce and Julien Mairal (INRIA Grenoble) on Computer Vision and Machine Learning. I obtained my PhD in Computer Science in June 2021 at Université Paris Dauphine-PSL, where I was advised by Prof. Jamal Atif, Prof. Yann Chevaleyre, and Dr. Benjamin Negrevergne. During my PhD, I focused on how to leverage the properties of structured matrices to improve the training of neural networks.

Alexandre Allauzen (LAMSADE, Paris-Dauphine University)
