

Set Norm and Equivariant Skip Connections: Putting the Deep in Deep Sets

Lily Zhang · Veronica Tozzo · John Higgins · Rajesh Ranganath

Hall E #524

Keywords: [ DL: Everything Else ]


Permutation invariant neural networks are a promising tool for predictive modeling of set data. We show, however, that existing architectures struggle to perform well when they are deep. In this work, we mathematically and empirically analyze normalization layers and residual connections in the context of deep permutation invariant neural networks. We develop set norm, a normalization tailored for sets, and introduce the "clean path principle" for equivariant residual connections, alongside a novel benefit of such connections: the reduction of information loss. Based on our analysis, we propose Deep Sets++ and Set Transformer++, deep models that reach comparable or better performance than their original counterparts on a diverse suite of tasks. We additionally introduce Flow-RBC, a new single-cell dataset and real-world application of permutation invariant prediction. We open-source our data and code.
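To make the idea of a set-tailored normalization concrete, the sketch below shows one plausible NumPy implementation: each set is standardized over both its element and feature dimensions (so the statistics are per-set, making the operation permutation equivariant), followed by a per-feature learnable scale and shift. This is a hedged illustration of the general construction, not the authors' released code; the function name `set_norm` and the exact handling of `gamma`/`beta` are assumptions for this example.

```python
import numpy as np

def set_norm(x, gamma, beta, eps=1e-5):
    """Illustrative set-level normalization (a sketch, not the official code).

    x:     array of shape (batch, n_elements, n_features)
    gamma: per-feature scale, shape (n_features,)
    beta:  per-feature shift, shape (n_features,)

    Each set is normalized with statistics computed over ALL of its
    elements and features, so permuting a set's elements permutes the
    output identically (permutation equivariance).
    """
    mean = x.mean(axis=(1, 2), keepdims=True)   # one mean per set
    var = x.var(axis=(1, 2), keepdims=True)     # one variance per set
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Quick equivariance check: normalizing a permuted set equals
# permuting the normalized set.
x = np.random.randn(2, 5, 3)
gamma, beta = np.ones(3), np.zeros(3)
y = set_norm(x, gamma, beta)
perm = np.array([2, 0, 1, 4, 3])
assert np.allclose(set_norm(x[:, perm], gamma, beta), y[:, perm])
```

Because the normalization statistics depend only on the multiset of elements (not their order), stacking such layers preserves permutation equivariance, which is the property a deep set architecture must maintain end to end.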
