Learning to Optimize Multigrid PDE Solvers
Daniel Greenfeld · Meirav Galun · Ronen Basri · Irad Yavneh · Ron Kimmel

Tue Jun 11th 12:15 -- 12:20 PM @ Room 201

Constructing fast numerical solvers for partial differential equations (PDEs) is crucial for many scientific disciplines. A leading technique for solving large-scale PDEs is the use of multigrid methods. At the core of a multigrid solver is the prolongation matrix, which relates different scales of the problem. This matrix is strongly problem-dependent, and its optimal construction is critical to the efficiency of the solver. In practice, however, devising multigrid algorithms for new problems often poses formidable challenges. In this paper we propose a framework for learning multigrid solvers. Our method learns a (single) mapping from discretized PDEs to prolongation operators for a broad class of 2D diffusion problems. We train a neural network once for the entire class of PDEs, using an efficient and unsupervised loss function. Our tests demonstrate improved convergence rates compared to the widely used Black-Box multigrid scheme, suggesting that our method successfully learned rules for constructing prolongation matrices.
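To make the role of the prolongation matrix concrete, here is a minimal sketch of a classical two-grid cycle for the 1D Poisson equation. It uses a standard hand-crafted linear-interpolation prolongation P and the Galerkin coarse operator P^T A P; this is the textbook baseline setting, not the learned operators or the 2D diffusion problems studied in the paper, and all function names are illustrative.

```python
import numpy as np

def poisson_1d(n):
    # Discretized 1D Poisson operator -u'' with Dirichlet BCs,
    # n interior points, mesh size h = 1/(n+1).
    h = 1.0 / (n + 1)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return A

def linear_prolongation(nc):
    # Prolongation from nc coarse points to nf = 2*nc + 1 fine points
    # by linear interpolation; coarse point j sits at fine index 2j + 1.
    nf = 2 * nc + 1
    P = np.zeros((nf, nc))
    for j in range(nc):
        i = 2 * j + 1
        P[i, j] = 1.0       # coarse value injected at matching fine point
        P[i - 1, j] = 0.5   # interpolated to left neighbor
        P[i + 1, j] = 0.5   # interpolated to right neighbor
    return P

def two_grid_cycle(A, b, x, P, nu=2, omega=2.0 / 3.0):
    # One two-grid cycle: smoothing damps high-frequency error on the
    # fine grid; the coarse-grid correction, transferred through P,
    # removes the smooth error components.
    D = np.diag(A)
    for _ in range(nu):                      # pre-smoothing (weighted Jacobi)
        x = x + omega * (b - A @ x) / D
    Ac = P.T @ A @ P                         # Galerkin coarse-grid operator
    r = b - A @ x
    ec = np.linalg.solve(Ac, P.T @ r)        # solve restricted residual equation
    x = x + P @ ec                           # prolongate correction to fine grid
    for _ in range(nu):                      # post-smoothing
        x = x + omega * (b - A @ x) / D
    return x
```

For the constant-coefficient Poisson problem this choice of P already yields fast, h-independent convergence; the paper's point is that for varying-coefficient diffusion problems the right P depends strongly on the discretized operator, which is what the learned mapping provides.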

Author Information

Daniel Greenfeld (Weizmann Institute of Science)
Meirav Galun (Weizmann Institute of Science)
Ronen Basri (Weizmann Institute of Science)
Irad Yavneh (Technion)
Ron Kimmel (Technion)
