Learning Symmetric Embeddings for Equivariant World Models

Jung Yeon Park · Ondrej Biza · Linfeng Zhao · Jan-Willem van de Meent · Robin Walters

Hall E #534

Keywords: [ DL: Sequential Models, Time series ] [ DL: Graph Neural Networks ] [ DL: Other Representation Learning ]


Incorporating symmetries can lead to highly data-efficient and generalizable models by defining equivalence classes of data samples related by transformations. However, characterizing how transformations act on input data is often difficult, limiting the applicability of equivariant models. We propose learning symmetric embedding networks (SENs) that encode an input space (e.g., images), where the effect of transformations (e.g., rotations) is unknown, into a feature space that transforms in a known manner under these operations. This network can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation. We validate this approach in the context of equivariant transition models with three distinct forms of symmetry. Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations. Moreover, doing so can yield improvements in accuracy and generalization relative to both fully-equivariant and non-equivariant baselines.
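To make the setup concrete, below is a minimal sketch of the general pattern the abstract describes, not the authors' implementation: an unconstrained encoder (the SEN) maps images into a latent space on which a group acts in a known way, and an equivariant task network is trained end-to-end with it on a transition objective. All module names, dimensions, and hyperparameters are illustrative assumptions; here the group is C4 acting by cyclic shifts of four latent blocks, the task network is a group convolution over Z4, and the action input of a real transition model is omitted for brevity.

```python
import torch
import torch.nn as nn

class SymmetricEmbedding(nn.Module):
    """Unconstrained encoder: images -> latent split into |C4| = 4 channel blocks."""
    def __init__(self, d=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 4 * d),
        )
        self.d = d

    def forward(self, x):
        # Latent of shape (batch, 4, d); C4 acts by cyclic shifts along dim 1.
        return self.net(x).view(-1, 4, self.d)

class EquivariantTransition(nn.Module):
    """Group convolution over Z4: commutes with cyclic shifts of the group axis."""
    def __init__(self, d=16):
        super().__init__()
        self.weight = nn.Parameter(0.02 * torch.randn(4, d, d))

    def forward(self, z):
        # out[g] = sum_h z[h] @ W[(g - h) mod 4], a convolution over the group.
        outs = [
            sum(z[:, h] @ self.weight[(g - h) % 4] for h in range(4))
            for g in range(4)
        ]
        return torch.stack(outs, dim=1)

# End-to-end training step on an observed transition (x_t, x_next);
# random tensors stand in for environment observations here.
sen, task = SymmetricEmbedding(), EquivariantTransition()
opt = torch.optim.Adam(list(sen.parameters()) + list(task.parameters()), lr=1e-3)
x_t, x_next = torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64)
loss = nn.functional.mse_loss(task(sen(x_t)), sen(x_next))
loss.backward()
opt.step()
```

Because the group convolution commutes with cyclic shifts of the latent blocks, the end-to-end loss can only be minimized if the encoder maps transformed inputs to correspondingly shifted latents; this constraint is what drives the embedding toward an explicitly symmetric representation, even though the encoder itself is unconstrained.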
