Identifiable Smooth Conjugacy Learning via Adversarial Orthogonality
Abstract
Data-driven dynamical system models often fail to recover the long-term structure of the underlying system, since their behavior is only weakly constrained off the data manifold. Conjugacy-based approaches address this limitation by learning a diffeomorphism that pushes forward a source vector field to match the observed dynamics, so that the learned model inherits its qualitative topology from the source. However, such methods typically presuppose that the chosen source system is topologically compatible with the target data. When this assumption is violated, the conjugacy problem becomes ill-posed: arbitrary corrections can be traded off against diffeomorphic variation, leading to non-identifiability. We propose a framework that relaxes this assumption by jointly learning the diffeomorphic conjugacy together with controlled adjustments to the source dynamics via low-dimensional context modulation. Inspired by versal unfolding theory, we constrain the modulation space to be orthogonal to the worst-case orbit-tangent directions, obtained by adversarially searching over a class of parameterized diffeomorphisms. This promotes an identifiable decomposition of dynamical variation into a diffeomorphic component and an intrinsic, topology-changing component, enabling interpretable corrections that recover canonical structure such as normal forms and symmetries.
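To make the two ingredients of the abstract concrete, the following is a minimal sketch, not the paper's implementation: it uses a hypothetical Hopf normal form as the source system, a simple affine map as a stand-in for the learned diffeomorphism, and a scalar parameter `mu` as the context modulation. The pushforward identity it implements is the standard one, (φ_* f)(φ(z)) = Dφ(z) f(z), and the orthogonality term penalizes alignment between the modulation direction ∂f/∂μ and the orbit tangent f(z); the adversarial search over diffeomorphisms described in the abstract is omitted.

```python
import jax
import jax.numpy as jnp

def source_field(z, mu):
    # Hypothetical source system: supercritical Hopf normal form in 2-D,
    # with mu acting as the low-dimensional context modulation.
    x, y = z
    r2 = x**2 + y**2
    return jnp.array([mu * x - y - x * r2,
                      x + mu * y - y * r2])

def phi(z, A, b):
    # Stand-in for the learned diffeomorphism (here an invertible affine map).
    return A @ z + b

def pushforward(z, A, b, mu):
    # (phi_* f)(phi(z)) = Dphi(z) f(z); for the affine map, Dphi(z) = A.
    J = jax.jacobian(phi)(z, A, b)
    return J @ source_field(z, mu)

def conjugacy_loss(params, z, v_obs):
    # Residual between the pushed-forward source velocity and the observed
    # target velocity v_obs at the point phi(z).
    A, b, mu = params
    return jnp.sum((pushforward(z, A, b, mu) - v_obs) ** 2)

def orthogonality_penalty(z, mu):
    # Penalize alignment of the modulation direction df/dmu with the orbit
    # tangent f(z); the adversarial worst-case search is not shown here.
    df_dmu = jax.jacobian(source_field, argnums=1)(z, mu)
    tangent = source_field(z, mu)
    return (df_dmu @ tangent) ** 2
```

With the identity diffeomorphism, the pushed-forward field coincides with the source field, so the conjugacy residual vanishes whenever the observed velocities are generated by the source system itself; training would then minimize the conjugacy loss plus the orthogonality penalty jointly over the diffeomorphism and modulation parameters.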