Poster in Workshop: The Synergy of Scientific and Machine Learning Modelling (SynS & ML)
Infinite-Fidelity Surrogate Learning via High-order Gaussian Processes
Shibo Li · Li Shi · Shandian Zhe
Keywords: [ Multi-Fidelity Modeling ] [ Multi-Output Gaussian Processes ] [ Physical Simulations ] [ Partial Differential Equations ]
Multi-fidelity learning is popular in computational physics. While the fidelity is often determined by the choice of mesh spacing and is hence continuous in nature, most methods only model finite, discrete fidelities. The recent work (Li et al., 2022) proposes the first continuous-fidelity surrogate model, named infinite-fidelity coregionalization (IFC), which uses a neural Ordinary Differential Equation (ODE) to capture the rich information within the infinite, continuous fidelity space. While showing state-of-the-art predictive performance, IFC is computationally expensive to train and difficult to use for uncertainty quantification. To overcome these limitations, we propose Infinite-Fidelity High-Order Gaussian Process (IF-HOGP), based on HOGP, a recent GP model for high-dimensional output regression. By tensorizing the output and using a product kernel at each mode, HOGP can efficiently estimate the mapping from the PDE parameters to the high-dimensional solution output, without the need for any low-rank approximation. We make a simple extension: we inject the continuous fidelity variable into the input and apply a neural network transformation before feeding the input into the kernel. On three benchmark PDEs, IF-HOGP achieves prediction accuracy better than or close to IFC, while gaining a 380x speed-up and a 7/8 reduction in memory. Meanwhile, uncertainty calibration for IF-HOGP is straightforward.
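The covariance structure described above can be sketched in a few lines. This is a hypothetical toy illustration (with made-up shapes, a one-layer `tanh` transformation, and a standard RBF kernel), not the authors' implementation: the continuous fidelity `t` is appended to the PDE parameters, passed through a neural transformation before the input kernel, and the covariance over the tensorized output is a Kronecker (product) structure of per-mode kernels, as in HOGP.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(0)

# Toy data: N PDE parameter vectors, each paired with a continuous fidelity.
N, d = 5, 3
X = rng.normal(size=(N, d))
t = rng.uniform(size=(N, 1))      # continuous fidelity in [0, 1]
Z = np.hstack([X, t])             # inject fidelity into the input

# Hypothetical one-layer neural transformation applied before the kernel.
W = rng.normal(size=(d + 1, 8))
b = rng.normal(size=8)
phi = np.tanh(Z @ W + b)

# Latent features for each mode of a tensorized output of shape (m1, m2).
m1, m2 = 4, 6
V1 = rng.normal(size=(m1, 2))
V2 = rng.normal(size=(m2, 2))

# HOGP-style product covariance: Kronecker of input and per-mode kernels,
# giving the full covariance over all N * m1 * m2 output entries without
# any low-rank approximation of the output.
K_in = rbf(phi, phi)
K1 = rbf(V1, V1)
K2 = rbf(V2, V2)
K_full = np.kron(np.kron(K_in, K1), K2)

print(K_full.shape)  # (120, 120), i.e. (N*m1*m2, N*m1*m2)
```

The Kronecker structure is what makes training efficient: the full matrix never needs to be factorized directly, since its eigendecomposition decomposes into those of the small per-mode kernels.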