

Poster

$\bf{\Phi}_\textrm{Flow}$: Differentiable Simulations for PyTorch, TensorFlow and Jax

Philipp Holl · Nils Thuerey

Hall C 4-9 #200
Wed 24 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract: Differentiable processes have proven to be an invaluable tool for machine learning (ML) in scientific and engineering settings, but most ML libraries are not primarily designed for such applications. We present $\Phi_\textrm{Flow}$, a Python toolkit that seamlessly integrates with PyTorch, TensorFlow, Jax and NumPy, simplifying the process of writing differentiable simulation code at every step. $\Phi_\textrm{Flow}$ provides many essential features that go beyond the capabilities of the base libraries, such as differential operators, boundary conditions, the ability to write dimensionality-agnostic code, floating-point precision management, fully differentiable preconditioned (sparse) linear solves, automatic matrix generation via function tracing, integration of SciPy optimizers, simulation vectorization, and visualization tools. At the same time, $\Phi_\textrm{Flow}$ inherits all important traits of the base ML libraries, such as GPU / TPU support, just-in-time compilation, and automatic differentiation. Put together, these features drastically simplify scientific code such as PDE or ODE solvers on grids or unstructured meshes, and $\Phi_\textrm{Flow}$ even includes out-of-the-box support for fluid simulations. $\Phi_\textrm{Flow}$ has been used in various publications and as a ground-truth solver in multiple scientific datasets.
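To make the abstract's claims concrete, the following is a minimal sketch of the style of code the toolkit targets: a single backend-agnostic diffusion step on a 2D grid, written with names from the project's public documentation (`CenteredGrid`, `diffuse.explicit`, `extrapolation`, `vis.show`). Treat it as an assumption about the current API rather than canonical usage; exact signatures may differ between versions.

```python
# Sketch (assumed phi.flow API): backend-agnostic explicit diffusion step.
# Swapping the import for phi.torch.flow, phi.tf.flow or phi.jax.flow
# selects the corresponding ML backend with automatic differentiation.
from phi.flow import *  # NumPy backend

# Scalar field sampled on a 64x64 grid over a 100x100 domain.
# The boundary condition is declared once and handled by the library.
temperature = CenteredGrid(
    Noise(),                 # random initial values
    extrapolation.ZERO,      # zero-value boundary extrapolation
    x=64, y=64,
    bounds=Box(x=100, y=100),
)

# One explicit diffusion step; the differential operators are provided
# by the toolkit instead of being written by hand per backend.
temperature = diffuse.explicit(temperature, diffusivity=0.5, dt=1.0)

# Built-in visualization.
vis.show(temperature)
```

Because the field carries its grid resolution, bounds, and boundary conditions, the same step function can be reused unchanged for other resolutions or dimensionalities, which is the dimensionality-agnostic behavior described above.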
