Poster in Workshop: AI for Science: Scaling in AI for Scientific Discovery
Towards Enforcing Hard Physics Constraints in Operator Learning Frameworks
Valentin Duruisseaux · Miguel Liu-Schiaffini · Julius Berner · Anima Anandkumar
Keywords: [ Physics-informed machine learning ] [ physics constraints ] [ operator learning ]
Enforcing physics constraints in surrogate models for PDE evolution operators can improve the physical plausibility of their predictions, as well as their convergence and generalization properties. However, imposing these constraints softly, as additional training loss terms, poses optimization challenges and does not guarantee that the constraints are satisfied at inference time, calling for stronger ways to impose them. In this paper, we introduce a new approach for enforcing hard physics constraints in operator learning frameworks. We propose to project the output of any operator surrogate model onto the space of functions satisfying a specified constraint, and to perform this projection in a suitable transformed space. Compared to prior works, our method is efficient, compatible with any existing operator learning architecture (either during or after training), and ensures that the physics constraint holds at all points in the spatiotemporal domain. In particular, our approach works remarkably well for linear differential constraints, where the projection can be performed very efficiently in Fourier space. We demonstrate the effectiveness of our approach by enforcing the divergence-free condition of the incompressible Navier-Stokes equations, where our projection operator enforces the constraint without sacrificing faithfulness to the data, at negligible additional cost.
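To make the Fourier-space projection concrete, the following is a minimal NumPy sketch of a divergence-free (Leray-type) projection for a 2D periodic velocity field. It is an illustrative implementation under stated assumptions (periodic square domain, uniform grid, and the function name leray_project chosen here), not the paper's exact code. In Fourier space, the divergence-free part of a velocity field û(k) is û(k) − k (k·û(k)) / |k|², applied wavenumber by wavenumber:

```python
# Hypothetical sketch: Fourier-space divergence-free (Leray) projection,
# assuming a 2D periodic velocity field sampled on an n x n uniform grid.
import numpy as np

def leray_project(u, v):
    """Project velocity components (u, v) onto the space of divergence-free fields.

    u, v: real arrays of shape (n, n), periodic velocity components.
    Returns the projected components (u_df, v_df).
    """
    n = u.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)            # integer wavenumbers 0, 1, ..., -1
    KX, KY = np.meshgrid(k, k, indexing="ij")
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                              # avoid division by zero at k = 0
                                                # (the mean-flow mode is untouched)

    u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
    # Subtract the gradient (curl-free) part: u_hat -= k (k . u_hat) / |k|^2
    div_hat = (KX * u_hat + KY * v_hat) / k2
    u_df = np.fft.ifft2(u_hat - KX * div_hat).real
    v_df = np.fft.ifft2(v_hat - KY * div_hat).real
    return u_df, v_df
```

A projection of this form can be applied as a final model layer or as a post-processing step; since it is an idempotent orthogonal projection, predictions that are already divergence-free pass through unchanged.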