Poster
On the Universality of Invariant Networks
Haggai Maron · Ethan Fetaya · Nimrod Segol · Yaron Lipman
Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning.
In this paper, we consider a fundamental question that has received very little attention to date: Can these networks approximate any (continuous) invariant function?
We tackle the rather general case of $G\leq S_n$, an arbitrary subgroup of the symmetric group acting on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: First, $G$-invariant networks are universal if high-order tensors are allowed. Second, there are groups $G$ for which higher-order tensors are unavoidable for obtaining universality.
$G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of such first-order $G$-invariant networks, and we conjecture that this condition is also sufficient.
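For a concrete feel of the first-order setting discussed above, consider the special case $G = S_n$ acting on $\mathbb{R}^n$ by permuting coordinates: the linear layers that commute with this action reduce to the well-known two-parameter DeepSets form (a per-coordinate term plus a pooled term). The sketch below is a minimal PyTorch illustration of such a first-order invariant network under that assumption; the class names `EquivariantLinear` and `InvariantNet` are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn

class EquivariantLinear(nn.Module):
    """Linear layer commuting with coordinate permutations (G = S_n, first-order tensors).

    Each such layer combines a per-coordinate linear map with a linear map of the
    (permutation-invariant) mean, so the parameter count is independent of n.
    """
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.per_coord = nn.Linear(in_channels, out_channels, bias=False)  # acts on each coordinate
        self.pooled = nn.Linear(in_channels, out_channels, bias=True)      # acts on the mean

    def forward(self, x):                        # x: (batch, n, in_channels)
        mean = x.mean(dim=1, keepdim=True)       # invariant summary, broadcast back to all coordinates
        return self.per_coord(x) + self.pooled(mean)

class InvariantNet(nn.Module):
    """Equivariant layers, then an invariant sum-pooling, then an MLP readout."""
    def __init__(self, n_channels=64):
        super().__init__()
        self.layers = nn.Sequential(
            EquivariantLinear(1, n_channels), nn.ReLU(),
            EquivariantLinear(n_channels, n_channels), nn.ReLU(),
        )
        self.readout = nn.Sequential(
            nn.Linear(n_channels, n_channels), nn.ReLU(),
            nn.Linear(n_channels, 1),
        )

    def forward(self, x):                        # x: (batch, n) -> (batch, 1)
        h = self.layers(x.unsqueeze(-1))         # lift to (batch, n, 1) feature channels
        return self.readout(h.sum(dim=1))        # summing over coordinates is S_n-invariant

# Sanity check: the output is unchanged under a random permutation of the coordinates.
net = InvariantNet()
x = torch.randn(2, 10)
perm = torch.randperm(10)
assert torch.allclose(net(x), net(x[:, perm]), atol=1e-5)
```

The paper's universality question asks whether networks of this first-order kind can approximate every continuous invariant function; the sketch only demonstrates the invariance constraint itself, not any approximation guarantee.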
Author Information
Haggai Maron (Weizmann Institute of Science)
Ethan Fetaya (University of Toronto)
Nimrod Segol (Weizmann Institute of Science)
Yaron Lipman (Weizmann Institute of Science)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Oral: On the Universality of Invariant Networks »
  Tue. Jun 11th 09:30 -- 09:35 PM, Room: Grand Ballroom
More from the Same Authors
- 2023 Oral: Equivariant Polynomials for Graph Neural Networks »
  Omri Puny · Derek Lim · Bobak T Kiani · Haggai Maron · Yaron Lipman
- 2023 Poster: Equivariant Polynomials for Graph Neural Networks »
  Omri Puny · Derek Lim · Bobak T Kiani · Haggai Maron · Yaron Lipman
- 2023 Poster: Multisample Flow Matching: Straightening Flows with Minibatch Couplings »
  Aram-Alexandre Pooladian · Heli Ben-Hamu · Carles Domingo i Enrich · Brandon Amos · Yaron Lipman · Ricky T. Q. Chen
- 2023 Poster: On Kinetic Optimal Probability Paths for Generative Models »
  Neta Shaul · Ricky T. Q. Chen · Maximilian Nickel · Matthew Le · Yaron Lipman
- 2023 Poster: MultiDiffusion: Fusing Diffusion Paths for Controlled Image Generation »
  Omer Bar-Tal · Lior Yariv · Yaron Lipman · Tali Dekel
- 2022 Poster: Matching Normalizing Flows and Probability Paths on Manifolds »
  Heli Ben-Hamu · samuel cohen · Joey Bose · Brandon Amos · Maximilian Nickel · Aditya Grover · Ricky T. Q. Chen · Yaron Lipman
- 2022 Spotlight: Matching Normalizing Flows and Probability Paths on Manifolds »
  Heli Ben-Hamu · samuel cohen · Joey Bose · Brandon Amos · Maximilian Nickel · Aditya Grover · Ricky T. Q. Chen · Yaron Lipman
- 2021 Poster: Phase Transitions, Distance Functions, and Implicit Neural Representations »
  Yaron Lipman
- 2021 Spotlight: Phase Transitions, Distance Functions, and Implicit Neural Representations »
  Yaron Lipman
- 2021 Poster: Riemannian Convex Potential Maps »
  samuel cohen · Brandon Amos · Yaron Lipman
- 2021 Spotlight: Riemannian Convex Potential Maps »
  samuel cohen · Brandon Amos · Yaron Lipman
- 2020 Poster: Implicit Geometric Regularization for Learning Shapes »
  Amos Gropp · Lior Yariv · Niv Haim · Matan Atzmon · Yaron Lipman
- 2019: Yaron Lipman, Weizmann Institute of Science »
  Yaron Lipman
- 2018 Poster: Reviving and Improving Recurrent Back-Propagation »
  Renjie Liao · Yuwen Xiong · Ethan Fetaya · Lisa Zhang · Kijung Yoon · Zachary S Pitkow · Raquel Urtasun · Richard Zemel
- 2018 Oral: Reviving and Improving Recurrent Back-Propagation »
  Renjie Liao · Yuwen Xiong · Ethan Fetaya · Lisa Zhang · Kijung Yoon · Zachary S Pitkow · Raquel Urtasun · Richard Zemel
- 2018 Poster: Neural Relational Inference for Interacting Systems »
  Thomas Kipf · Ethan Fetaya · Kuan-Chieh Wang · Max Welling · Richard Zemel
- 2018 Oral: Neural Relational Inference for Interacting Systems »
  Thomas Kipf · Ethan Fetaya · Kuan-Chieh Wang · Max Welling · Richard Zemel