The Lipschitz Constant of Self-Attention
Hyunjik Kim · George Papamakarios · Andriy Mnih

Wed Jul 21 07:25 AM -- 07:30 AM (PDT)

Lipschitz constants of neural networks have been explored in various contexts in deep learning, such as provable adversarial robustness, estimating Wasserstein distance, stabilising training of GANs, and formulating invertible neural networks. Such works have focused on bounding the Lipschitz constant of fully connected or convolutional networks, composed of linear maps and pointwise non-linearities. In this paper, we investigate the Lipschitz constant of self-attention, a non-linear neural network module widely used in sequence modelling. We prove that the standard dot-product self-attention is not Lipschitz when the input domain is unbounded, and propose an alternative L2 self-attention that is Lipschitz. We derive an upper bound on the Lipschitz constant of L2 self-attention and provide empirical evidence for its asymptotic tightness. To demonstrate the practical relevance of our theoretical work, we formulate invertible self-attention and use it in a Transformer-based architecture for a character-level language modelling task.
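The contrast between the two attention variants can be illustrated numerically. The sketch below uses a minimal single-head setup with identity projections (W_Q = W_K = W_V = I) and a simplified L2 attention whose scores are negative squared Euclidean distances between token rows; this is an assumption for illustration, not the paper's exact tied-weight parameterisation. A finite-difference estimate of the local Lipschitz ratio grows with the input magnitude for dot-product attention, but stays bounded for the L2 variant:

```python
import numpy as np

def softmax(z):
    # numerically stable row-wise softmax
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def dot_attention(X):
    # standard dot-product self-attention with identity projections
    # (W_Q = W_K = W_V = I), scaled by sqrt(d)
    d = X.shape[-1]
    A = softmax(X @ X.T / np.sqrt(d))
    return A @ X

def l2_attention(X):
    # simplified L2 self-attention: scores are negative squared
    # Euclidean distances between rows (a sketch, not the paper's
    # exact parameterisation)
    d = X.shape[-1]
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = softmax(-sq / np.sqrt(d))
    return A @ X

def diff_ratio(f, c, eps=1e-3):
    # finite-difference estimate of the local Lipschitz ratio at the
    # 2-token, 1-dimensional input X = [[0], [c]], perturbing the
    # first token by eps
    X = np.array([[0.0], [c]])
    Xp = np.array([[eps], [c]])
    return np.linalg.norm(f(Xp) - f(X)) / eps

# dot-product attention: the ratio grows roughly like c^2 / 4 at this
# input, so no single Lipschitz constant works on an unbounded domain
r_dot_small = diff_ratio(dot_attention, 1.0)
r_dot_large = diff_ratio(dot_attention, 10.0)

# L2 attention: the ratio stays bounded at the same inputs
r_l2_small = diff_ratio(l2_attention, 1.0)
r_l2_large = diff_ratio(l2_attention, 10.0)

print(r_dot_small, r_dot_large, r_l2_small, r_l2_large)
```

Scaling the input from c = 1 to c = 10 inflates the dot-product ratio by more than an order of magnitude, while the L2 ratio remains close to 1, consistent with the paper's claim that only the latter admits a Lipschitz bound.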

Author Information

Hyunjik Kim (DeepMind)
George Papamakarios (DeepMind)
Andriy Mnih (DeepMind)