

Poster in Workshop: Geometry-grounded Representation Learning and Generative Modeling

Permutation Tree Invariant Neural Architectures

Johannes Urban · Sebastian Tschiatschek · Nils M. Kriege

Keywords: [ permutation tree; permutation invariance ]


Abstract:

Exploiting symmetry as an inductive bias has become a fundamental technique in deep learning for improving generalization and sample efficiency. We investigate the design of models that are invariant to subgroups of the symmetric group defined by hierarchical structures. We propose permutation trees, which represent permutations by the ordering of their leaves and allow siblings to be reordered depending on the type of their parent node, generalizing PQ-trees. We characterize the permutation trees that represent permutation groups and derive invariant neural architectures from them in a bottom-up fashion. We show that our approach learns faster with less data and achieves improved prediction performance on a synthetic dataset.
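To make the bottom-up construction concrete, below is a minimal sketch, not the authors' architecture: it assumes classic PQ-tree semantics, where a P-node permits any reordering of its children and a Q-node permits only reversal, and shows how an encoder built bottom-up over such a tree can respect exactly those symmetries (sum pooling at P-nodes, a direction-symmetrized recurrent encoder at Q-nodes). The class name PermTreeEncoder, the encode method, and the nested-dict tree format are hypothetical.

```python
# Sketch of a bottom-up encoder invariant to PQ-tree symmetries (hypothetical,
# not the paper's exact model): P-nodes allow any child permutation, Q-nodes
# allow reversal only.
import torch
import torch.nn as nn


class PermTreeEncoder(nn.Module):
    def __init__(self, in_dim: int, hidden: int):
        super().__init__()
        self.leaf = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # P-node: sum pooling is invariant to any permutation of the children.
        self.p_mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        # Q-node: run a GRU over the children in both directions and sum the
        # final states; the result is invariant to reversing the child order.
        self.q_rnn = nn.GRU(hidden, hidden, batch_first=True)

    def encode(self, node: dict) -> torch.Tensor:
        """Encode a permutation tree given as nested dicts, bottom-up."""
        if node["type"] == "leaf":
            return self.leaf(node["x"])
        child_emb = torch.stack([self.encode(c) for c in node["children"]])
        if node["type"] == "P":                      # any order of children
            return self.p_mlp(child_emb.sum(dim=0))
        if node["type"] == "Q":                      # fixed order or its reverse
            fwd, _ = self.q_rnn(child_emb.unsqueeze(0))
            bwd, _ = self.q_rnn(child_emb.flip(0).unsqueeze(0))
            return fwd[0, -1] + bwd[0, -1]
        raise ValueError(f"unknown node type {node['type']}")


# Usage: the embedding is unchanged if the P-node's children are shuffled
# or the Q-node's children are reversed.
enc = PermTreeEncoder(in_dim=4, hidden=16)
leaves = [{"type": "leaf", "x": torch.randn(4)} for _ in range(3)]
tree = {"type": "Q", "children": [{"type": "P", "children": leaves}, leaves[0]]}
print(enc.encode(tree).shape)  # torch.Size([16])
```

The design choice mirrors the group structure: each node type is paired with an aggregation that is invariant precisely to the reorderings that node permits, and composing these aggregations bottom-up yields invariance to the subgroup of the symmetric group generated by the whole tree. The paper's permutation trees generalize this beyond the two PQ node types shown here.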
