

Poster

Learning to Optimize Multigrid PDE Solvers

Daniel Greenfeld · Meirav Galun · Ronen Basri · Irad Yavneh · Ron Kimmel

Pacific Ballroom #249

Keywords: [ Other Applications ]


Abstract:

Constructing fast numerical solvers for partial differential equations (PDEs) is crucial for many scientific disciplines. A leading technique for solving large-scale PDEs is the multigrid method. At the core of a multigrid solver is the prolongation matrix, which relates different scales of the problem. This matrix is strongly problem-dependent, and its optimal construction is critical to the efficiency of the solver. In practice, however, devising multigrid algorithms for new problems often poses formidable challenges. In this paper we propose a framework for learning multigrid solvers. Our method learns a (single) mapping from discretized PDEs to prolongation operators for a broad class of 2D diffusion problems. We train a neural network once for the entire class of PDEs, using an efficient and unsupervised loss function. Our tests demonstrate improved convergence rates compared to the widely used Black-Box multigrid scheme, suggesting that our method successfully learned rules for constructing prolongation matrices.
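The paper's learned network is not reproduced here. As a rough illustration of where the prolongation operator enters a multigrid solver, the following sketch (assuming NumPy/SciPy; all function names are illustrative) runs a standard two-grid cycle on a 2D Poisson problem, with ordinary bilinear interpolation standing in as a placeholder for the learned prolongation P. The coarse-grid operator is formed by Galerkin coarsening, A_c = P^T A P, which is the point where the choice of P shapes the solver's convergence.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def poisson2d(n):
    """5-point finite-difference Laplacian on an n x n interior grid (Dirichlet BCs)."""
    T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
    I = sp.identity(n)
    return (sp.kron(I, T) + sp.kron(T, I)).tocsr()

def linear_prolongation_1d(nc):
    """1D linear interpolation from nc coarse points to 2*nc + 1 fine points."""
    nf = 2 * nc + 1
    P = sp.lil_matrix((nf, nc))
    for j in range(nc):
        i = 2 * j + 1            # fine-grid index of the j-th coarse point
        P[i, j] = 1.0
        P[i - 1, j] += 0.5
        P[i + 1, j] += 0.5
    return P.tocsr()

def two_grid_cycle(A, P, b, x, nu=2, omega=0.8):
    """One two-grid cycle: pre-smooth, coarse-grid correction via P, post-smooth."""
    D_inv = 1.0 / A.diagonal()
    smooth = lambda x: x + omega * D_inv * (b - A @ x)   # weighted Jacobi
    for _ in range(nu):
        x = smooth(x)
    r = b - A @ x
    Ac = (P.T @ A @ P).tocsc()                # Galerkin coarse operator A_c = P^T A P
    x = x + P @ spla.spsolve(Ac, P.T @ r)     # coarse-grid correction
    for _ in range(nu):
        x = smooth(x)
    return x

# n must be odd so the coarse grid nests in the fine grid
n, nc = 31, 15
A = poisson2d(n)
P1 = linear_prolongation_1d(nc)
P = sp.kron(P1, P1).tocsr()                   # 2D bilinear prolongation (placeholder for a learned P)
b = np.ones(n * n)
x = np.zeros_like(b)
for k in range(10):
    x = two_grid_cycle(A, P, b, x)
    print(k, np.linalg.norm(b - A @ x))       # residual norm should shrink each cycle
```

In the learned setting described in the abstract, the fixed interpolation stencil above would be replaced by stencils predicted by the trained network from the local discretization coefficients; the rest of the cycle is unchanged.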
