

Spotlight

Provable Lipschitz Certification for Generative Models

Matt Jordan · Alexandros Dimakis

[ Livestream: Visit Security and Explainability ] [ Paper ]

Abstract:

We present a scalable technique for upper bounding the Lipschitz constant of generative models. We relate this quantity to the maximal norm over the set of attainable vector-Jacobian products of a given generative model. We approximate this set by layerwise convex approximations using zonotopes. Our approach generalizes and improves upon prior work on zonotope transformers, and it extends Lipschitz estimation to neural networks with large output dimension. The method yields efficient and tight bounds on small networks and scales to generative models built on VAE and DCGAN architectures.
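The layerwise convex approximation can be illustrated with a minimal DeepZ-style zonotope transformer: an affine layer maps a zonotope exactly, while ReLU is soundly over-approximated by introducing one fresh noise symbol per unstable neuron. This is a hedged sketch in NumPy, not the authors' implementation; the `Zonotope` class and its method names are illustrative.

```python
import numpy as np

class Zonotope:
    """Set {center + eps @ generators : eps in [-1, 1]^k} in R^n."""

    def __init__(self, center, generators):
        self.center = np.asarray(center, dtype=float)        # shape (n,)
        self.generators = np.asarray(generators, dtype=float)  # shape (k, n)

    def bounds(self):
        # Interval concretization: center +/- sum of |generator| rows.
        radius = np.abs(self.generators).sum(axis=0)
        return self.center - radius, self.center + radius

    def affine(self, W, b):
        # Exact image under x -> W x + b (zonotopes are closed under affine maps).
        return Zonotope(W @ self.center + b, self.generators @ W.T)

    def relu(self):
        # Sound over-approximation of ReLU (DeepZ-style transformer):
        # stable neurons pass through or zero out; unstable neurons get the
        # optimal-slope relaxation lambda = u / (u - l) plus a new noise symbol.
        l, u = self.bounds()
        lam = np.where(u <= 0, 0.0,
                       np.where(l >= 0, 1.0, u / np.maximum(u - l, 1e-12)))
        mu = np.where((l < 0) & (u > 0), -lam * l / 2.0, 0.0)
        center = lam * self.center + mu
        gens = lam * self.generators
        fresh = np.diag(mu)[mu != 0]  # one new generator per crossing neuron
        if fresh.size == 0:
            fresh = np.zeros((0, center.size))
        return Zonotope(center, np.vstack([gens, fresh]))

# Propagate the box [-1, 1]^2 through an identity affine layer and ReLU.
z = Zonotope(np.zeros(2), np.eye(2))
out = z.affine(np.eye(2), np.zeros(2)).relu()
lo, hi = out.bounds()  # contains the true ReLU range [0, 1] per coordinate
```

Because each transformer is sound, the output zonotope encloses all attainable activations, so a norm bound computed over it upper bounds the corresponding maximum over the true reachable set; the crossing-neuron relaxation is exactly where the over-approximation (and hence looseness in the certificate) is introduced.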
