Poster in Workshop: AI for Science: Scaling in AI for Scientific Discovery
Physics-Informed Neural Networks for Derivative-Constrained PDEs
Kentaro Hoshisashi · Carolyn Phelan · Paolo Barucca
Keywords: [ Partial Differential Equations ] [ Machine Learning ] [ Multi-Objective Learning ] [ Physics-Informed Neural Networks ] [ Derivative-Constrained ]
Physics-Informed Neural Networks (PINNs) have emerged as a promising approach for solving partial differential equations (PDEs) using deep learning. However, standard PINNs do not address constrained PDEs, where the solution must satisfy additional equality or inequality constraints beyond the governing equations. In this paper, we introduce Derivative-Constrained PINNs (DC-PINNs), a novel framework that seamlessly incorporates constraint information into the PINN training process. DC-PINNs employ a constraint-aware loss function that penalizes constraint violations while simultaneously minimizing the PDE residual. Key components include self-adaptive loss balancing techniques that automatically tune the relative weighting of each term, enhancing training stability, and the use of automatic differentiation to efficiently compute exact derivatives. This study demonstrates the effectiveness of DC-PINNs on several benchmark problems from quantitative finance: heat diffusion, Black-Scholes pricing, and local volatility surface calibration. The results show improved constraint satisfaction and solution quality compared to baseline PINN methods. The DC-PINNs framework opens up new possibilities for solving constrained PDEs in multi-objective optimization.
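The abstract's three ingredients (a PDE residual term, a constraint-violation penalty, and self-adaptive loss balancing via automatic differentiation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the network architecture, the heat equation u_t = u_xx, the inequality constraint u >= 0, and the uncertainty-style learnable weights are all illustrative assumptions.

```python
import torch

# Small MLP surrogate u(t, x); the architecture is illustrative.
class PINN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(2, 32), torch.nn.Tanh(),
            torch.nn.Linear(32, 32), torch.nn.Tanh(),
            torch.nn.Linear(32, 1),
        )

    def forward(self, t, x):
        return self.net(torch.cat([t, x], dim=-1))

def dc_pinn_loss(model, t, x, log_w_pde, log_w_con):
    """PDE residual for the heat equation u_t = u_xx plus a penalty for the
    illustrative inequality constraint u >= 0, combined with learnable
    log-variance weights (one common self-adaptive balancing scheme)."""
    t = t.requires_grad_(True)
    x = x.requires_grad_(True)
    u = model(t, x)
    # Exact derivatives via automatic differentiation, as the paper emphasizes.
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    pde_residual = (u_t - u_xx).pow(2).mean()
    constraint_violation = torch.relu(-u).pow(2).mean()  # penalize u < 0
    # Self-adaptive balancing: each term scaled by exp(-log_w), plus a
    # regularizer log_w so the weights cannot collapse to zero loss.
    return (torch.exp(-log_w_pde) * pde_residual + log_w_pde
            + torch.exp(-log_w_con) * constraint_violation + log_w_con)

# Usage: one optimization step over random collocation points.
torch.manual_seed(0)
model = PINN()
log_w_pde = torch.zeros(1, requires_grad=True)
log_w_con = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam(list(model.parameters()) + [log_w_pde, log_w_con], lr=1e-3)
t, x = torch.rand(64, 1), torch.rand(64, 1)
loss = dc_pinn_loss(model, t, x, log_w_pde, log_w_con)
opt.zero_grad()
loss.backward()
opt.step()
```

Both the network parameters and the two log-weights are updated by the same optimizer, so the relative weighting of the residual and constraint terms is tuned automatically during training rather than set by hand.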