Lie-Algebraic Neural Koopman Dynamics
Abstract
We present a Lie-algebraic approach to modeling Koopman dynamics that integrates algebraic structure with computational scalability. The proposed formulation constrains the neural generators to evolve within prescribed Lie subalgebras and constructs finite-time flows through a neural Magnus expansion, thereby maintaining consistency with the associated Lie-group composition over each time segment. To address the computational burden of sequential propagation, we exploit the associativity of Lie-group composition and assemble segmentwise propagators via a prefix-scan algorithm, which reduces the depth of temporal composition from linear to logarithmic in the number of segments. Consequently, the framework enables accurate long-horizon prediction at reduced computational cost, and provides a principled foundation for scalable Koopman operator learning in nonlinear systems.
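The log-depth composition claimed above can be illustrated with a minimal sketch. Here we assume each time segment's propagator is a Lie-group element (for concreteness, a closed-form exponential of an so(2) generator), and compose the segmentwise flows with a Hillis-Steele doubling scan, whose correctness rests only on the associativity of group composition. The names `rot` and `prefix_scan` are illustrative and not from the paper.

```python
import numpy as np

def rot(theta):
    # Closed-form exponential of the so(2) generator theta * [[0,-1],[1,0]]:
    # a single-segment propagator exp(theta * J) in SO(2).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def prefix_scan(mats):
    """Inclusive prefix scan of matrix products in O(log n) composition depth.

    mats[k] is the propagator of segment k; entry k of the result is the
    accumulated flow mats[k] @ ... @ mats[0] (later segments act on the left).
    Each doubling pass pre-composes every entry with the accumulated product
    from `d` segments earlier, so only ceil(log2(n)) passes are needed.
    """
    out = list(mats)
    d = 1
    while d < len(out):
        nxt = list(out)
        for k in range(d, len(out)):
            nxt[k] = out[k] @ out[k - d]  # valid because composition is associative
        out = nxt
        d *= 2
    return out

# Toy check: 8 segments with small random rotation angles as generators.
rng = np.random.default_rng(0)
thetas = rng.uniform(-0.1, 0.1, size=8)
segs = [rot(t) for t in thetas]

scanned = prefix_scan(segs)
sequential = segs[0]
for m in segs[1:]:
    sequential = m @ sequential
assert np.allclose(scanned[-1], sequential)
# For the abelian group SO(2), the total flow is the exponential of the summed angles.
assert np.allclose(scanned[-1], rot(thetas.sum()))
```

In a full implementation the same scan pattern would apply to propagators produced by a truncated Magnus expansion on each segment; frameworks with a built-in associative scan (e.g. `jax.lax.associative_scan`) evaluate the doubling passes in parallel, which is where the linear-to-logarithmic depth reduction pays off.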