Poster
Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
Juho Lee · Yoonho Lee · Jungtaek Kim · Adam Kosiorek · Seungjin Choi · Yee-Whye Teh

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #23

Many machine learning tasks such as multiple instance learning, 3D shape recognition, and few-shot image classification are defined on sets of instances. Since solutions to such problems do not depend on the order of elements of the set, models used to address them should be permutation invariant. We present an attention-based neural network module, the Set Transformer, specifically designed to model interactions among elements in the input set. The model consists of an encoder and a decoder, both of which rely on attention mechanisms. To reduce computational complexity, we introduce an attention scheme inspired by inducing point methods from the sparse Gaussian process literature. It reduces the computation time of self-attention from quadratic to linear in the number of elements in the set. We show that our model is theoretically attractive, and we evaluate it on a range of tasks, demonstrating state-of-the-art performance compared to recent methods for set-structured data.
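The inducing-point attention the abstract refers to is the paper's Induced Set Attention Block (ISAB): self-attention is routed through a small set of m learnable inducing points, so each block costs O(nm) rather than O(n^2) in the set size n. Below is a minimal sketch of the idea, assuming PyTorch; the block names MAB and ISAB follow the paper's terminology, but the layer sizes and implementation details are illustrative, not the authors' reference code.

import torch
import torch.nn as nn

class MAB(nn.Module):
    """Multihead Attention Block: queries X attend to keys/values Y,
    with residual connections and a position-wise feed-forward layer."""
    def __init__(self, dim, num_heads):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ln1 = nn.LayerNorm(dim)
        self.ln2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, y):
        h = self.ln1(x + self.attn(x, y, y)[0])  # cross-attention + residual
        return self.ln2(h + self.ff(h))          # feed-forward + residual

class ISAB(nn.Module):
    """Induced Set Attention Block: self-attention routed through
    num_inducing learnable points, cutting cost from O(n^2) to O(n*m)."""
    def __init__(self, dim, num_heads, num_inducing):
        super().__init__()
        self.inducing = nn.Parameter(torch.randn(1, num_inducing, dim))
        self.mab1 = MAB(dim, num_heads)
        self.mab2 = MAB(dim, num_heads)

    def forward(self, x):                            # x: (batch, n, dim)
        i = self.inducing.expand(x.size(0), -1, -1)
        h = self.mab1(i, x)                          # summarize set into m slots
        return self.mab2(x, h)                       # broadcast summary back to n elements

# Example: a batch of 8 sets with 1000 elements each, 16 inducing points.
block = ISAB(dim=64, num_heads=4, num_inducing=16)
out = block(torch.randn(8, 1000, 64))                # -> (8, 1000, 64)

The inducing points play the same role as the pseudo-inputs in sparse Gaussian process approximations: the set is first summarized into m slots, and each of the n elements then attends back to that summary.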

Author Information

Juho Lee (AITRICS)
Yoonho Lee (Kakao Corporation)
Jungtaek Kim (POSTECH)
Adam Kosiorek (University of Oxford)

I am a PhD student supervised by Ingmar Posner and Yee Whye Teh. I am interested in machine reasoning, mostly efficient inference in deep generative models, especially for time series. I am also excited by attention mechanisms and external memory for neural networks. I received an MSc in Computational Science & Engineering from the Technical University of Munich, where I worked on VAEs with Patrick van der Smagt. In my free time I practice gymnastics and read lots of books.

Seungjin Choi (POSTECH)
Yee-Whye Teh (Oxford and DeepMind)
