Poster
in
Workshop: New Frontiers in Learning, Control, and Dynamical Systems

Embedding Surfaces by Optimizing Neural Networks with Prescribed Riemannian Metric and Beyond

Yi Feng · Sizhe Li · Ioannis Panageas · Xiao Wang


Abstract:

From a machine learning perspective, the problem of solving partial differential equations (PDEs) can be formulated as a least-squares minimization problem, where neural networks are used to parametrize PDE solutions. Ideally, a global minimizer of the squared loss corresponds to a solution of the PDE. In this paper we start with a special type of nonlinear PDE arising from differential geometry, the isometric embedding equation, which relates to many long-standing open questions in geometry and analysis. We show that gradient descent can identify a global minimizer of the least-squares loss function with two-layer neural networks under the assumption of over-parametrization. As a consequence, this solves the surface embedding problem locally with a prescribed Riemannian metric. We also extend the convergence analysis of gradient descent to higher-order linear PDEs under the over-parametrization assumption.
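The least-squares formulation described above can be sketched on a toy problem. The example below is not the paper's isometric embedding equation; it is a hypothetical 1D linear PDE, u''(x) = -sin(x) on [0, π] with zero boundary values, chosen only to illustrate the general recipe: parametrize the solution by an over-parametrized two-layer network, define the loss as the mean squared PDE residual plus a boundary penalty, and run plain gradient descent. The width m = 64, the learning rate, and the step count are illustrative assumptions, not values from the paper.

```python
import jax
import jax.numpy as jnp

def net(params, x):
    # Two-layer network u(x) = a^T tanh(w x + b), scalar input and output.
    w, b, a = params
    return jnp.tanh(w * x + b) @ a

# Second derivative u''(x) via nested automatic differentiation in x.
u_xx = jax.grad(jax.grad(net, argnums=1), argnums=1)

def loss(params, xs):
    # Least-squares PDE residual for u''(x) + sin(x) = 0,
    # plus a penalty enforcing u(0) = u(pi) = 0.
    res = jax.vmap(lambda x: u_xx(params, x) + jnp.sin(x))(xs)
    bc = net(params, 0.0) ** 2 + net(params, jnp.pi) ** 2
    return jnp.mean(res ** 2) + bc

m = 64  # over-parametrized width (illustrative choice)
kw, kb, ka = jax.random.split(jax.random.PRNGKey(0), 3)
params = [jax.random.normal(kw, (m,)),
          jax.random.normal(kb, (m,)),
          jax.random.normal(ka, (m,)) / m]

xs = jnp.linspace(0.0, jnp.pi, 32)      # collocation points
grad_loss = jax.jit(jax.grad(loss))

loss0 = float(loss(params, xs))
lr = 1e-2
for _ in range(2000):                   # plain gradient descent
    g = grad_loss(params, xs)
    params = [p - lr * gi for p, gi in zip(params, g)]
loss_final = float(loss(params, xs))
```

Under the over-parametrization regime the paper analyzes, gradient descent on such a loss is shown to reach a global minimizer; in this small sketch one can only observe the loss decreasing empirically.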