Poster
On the Universality of Invariant Networks
Haggai Maron · Ethan Fetaya · Nimrod Segol · Yaron Lipman

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #74
Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received little attention to date: Can these networks approximate any (continuous) invariant function? We tackle the rather general case of $G\leq S_n$, an arbitrary subgroup of the symmetric group acting on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: First, $G$-invariant networks are universal if high-order tensors are allowed. Second, there are groups $G$ for which higher-order tensors are unavoidable for obtaining universality. $G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors, and we conjecture that this condition is also sufficient.
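To make the symmetrization idea concrete, below is a minimal sketch (not the paper's construction) of obtaining a $G$-invariant function by averaging an arbitrary function over the group, sometimes called the Reynolds operator. The group is represented explicitly as a list of permutations of the coordinates; the base function f and the example group S_3 are illustrative assumptions, not taken from the paper.

    # Minimal sketch: G-invariance via group averaging (illustrative, not the
    # paper's construction). G <= S_n is given as a list of permutations of
    # {0, ..., n-1}; the base function f below is an arbitrary example.
    import itertools
    import numpy as np

    def reynolds(f, group):
        """Return x -> (1/|G|) * sum over g in G of f(g . x), which is G-invariant."""
        def f_inv(x):
            return np.mean([f(x[list(g)]) for g in group], axis=0)
        return f_inv

    # Example group: G = S_3, the full symmetric group on 3 coordinates.
    n = 3
    G = list(itertools.permutations(range(n)))

    f = lambda x: np.tanh(x[0] + 2.0 * x[1] - x[2])  # not invariant on its own
    f_inv = reynolds(f, G)

    # Invariance check: f_inv agrees on every permuted copy of x.
    x = np.array([0.5, -1.0, 2.0])
    for g in G:
        assert np.isclose(f_inv(x[list(g)]), f_inv(x))

Note that this averaging costs $|G|$ evaluations per input, which is exactly why invariant networks instead bake the symmetry into the linear layers themselves, the setting whose expressive power the paper analyzes.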

Author Information

Haggai Maron (Weizmann Institute of Science)
Ethan Fetaya (University of Toronto)
Nimrod Segol (Weizmann Institute of Science)
Yaron Lipman (Weizmann Institute of Science)
