

Poster

Learning from Integral Losses in Physics Informed Neural Networks

Ehsan Saleh · Saba Ghaffari · Timothy Bretl · Luke Olson · Matthew West

Hall C 4-9 #1212
Thu 25 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

This work proposes a solution to the problem of training physics-informed neural networks under partial integro-differential equations. These equations require an infinite or very large number of neural network evaluations to construct a single training residual, so accurate evaluation can be impractical, and we show that naively replacing these integrals with unbiased estimates leads to biased loss functions and biased solutions. To overcome this bias, we investigate three types of potential solutions: deterministic sampling approaches, the double-sampling trick, and the delayed target method. We consider three classes of PDEs for benchmarking: one defining Poisson problems with singular charges and weak solutions in up to 10 dimensions, another involving weak solutions for electromagnetic fields and a Maxwell equation, and a third defining a Smoluchowski coagulation problem. Our numerical results confirm the existence of the aforementioned bias in practice and show that our proposed delayed target approach leads to accurate solutions, comparable in quality to ones estimated with large-sample integral evaluations. Our implementation is open-source and available at https://github.com/ehsansaleh/btspinn.
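To illustrate the bias the abstract refers to, below is a minimal NumPy sketch; it is not taken from the paper's code, and the integral value, noise scale, and sample counts are hypothetical. If an integral inside a residual is replaced by an unbiased Monte-Carlo estimate, squaring that estimate adds the estimator's variance to the expected loss. The double-sampling trick instead multiplies two independent estimates of the same residual, so the extra variance term cancels in expectation.

# Minimal sketch (not from the paper's repository) of the bias from squaring
# an unbiased Monte-Carlo integral estimate, and of the double-sampling fix.
# The true integral value, noise scale, and sample counts are illustrative.
import numpy as np

rng = np.random.default_rng(0)
true_integral = 0.7          # hypothetical exact value of the integral term
target = 0.7                 # the residual should be zero at the true solution
n_trials, n_samples = 100_000, 4

# Small batches of noisy samples whose mean is an unbiased integral estimate.
samples = rng.normal(loc=true_integral, scale=1.0, size=(n_trials, n_samples))
est_a = samples[:, : n_samples // 2].mean(axis=1)   # first independent estimate
est_b = samples[:, n_samples // 2 :].mean(axis=1)   # second independent estimate

# Naive loss: E[(est - target)^2] = Var(est) > 0 even at the true solution.
naive_loss = ((est_a - target) ** 2).mean()
# Double-sampling loss: E[(est_a - target)(est_b - target)] = 0 at the true solution.
double_sample_loss = ((est_a - target) * (est_b - target)).mean()

print(f"naive loss (biased by the estimator variance): {naive_loss:.4f}")
print(f"double-sampling loss (approximately unbiased): {double_sample_loss:.4f}")

Running this prints a naive loss near the estimator variance (about 0.5 here) and a double-sampling loss near zero, matching the intuition that the naive objective penalizes the true solution while the double-sampled one does not.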
