

Poster
in
Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

$\bf{\Phi}_\textrm{Flow}$: Differentiable Simulations for Machine Learning

Philipp Holl · Nils Thuerey

Keywords: [ differentiable physics ] [ TensorFlow ] [ NumPy ] [ Simulation ] [ PyTorch ] [ SciPy ] [ Machine Learning ] [ JAX ]


Abstract: We present $\Phi_\textrm{Flow}$, a Python toolkit that integrates seamlessly with PyTorch, TensorFlow, JAX, and NumPy, simplifying every step of writing differentiable simulation code. $\Phi_\textrm{Flow}$ provides many essential features that go beyond the capabilities of the base ML libraries, such as differential operators, boundary conditions, the ability to write dimensionality-agnostic code, floating-point precision management, fully differentiable preconditioned (sparse) linear solves, automatic matrix generation via function tracing, integration of SciPy optimizers, simulation vectorization, and visualization tools. At the same time, $\Phi_\textrm{Flow}$ inherits all the important traits of the base ML libraries, such as GPU / TPU support, just-in-time compilation, and automatic differentiation. Together, these features drastically simplify scientific code such as PDE or ODE solvers on grids or unstructured meshes, and $\Phi_\textrm{Flow}$ even includes out-of-the-box support for fluid simulations. $\Phi_\textrm{Flow}$ is available at https://github.com/tum-pbs/PhiFlow.
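To illustrate the core idea of a differentiable simulation without assuming $\Phi_\textrm{Flow}$'s own API, the sketch below builds a tiny explicit heat-equation solver in plain NumPy (one of the supported backends) and computes the gradient of a loss through the full rollout. All names (`laplace`, `step`, `simulate`, `loss`) are hypothetical helpers for this example; in practice a toolkit like $\Phi_\textrm{Flow}$ would supply the differential operators, boundary handling, and automatic differentiation.

```python
import numpy as np

def laplace(u):
    # 1-D Laplacian with periodic boundary conditions
    return np.roll(u, -1) + np.roll(u, 1) - 2.0 * u

def step(u, dt=0.1):
    # one explicit Euler step of the heat equation du/dt = laplace(u)
    return u + dt * laplace(u)

def simulate(u0, steps=10):
    u = u0
    for _ in range(steps):
        u = step(u)
    return u

def loss(u0, steps=10):
    # scalar objective on the final state
    return float(np.sum(simulate(u0, steps) ** 2))

# Each step is linear and the periodic Laplacian is symmetric, so the
# adjoint of one step equals the step itself. For loss = ||S^T u0||^2
# with T steps, the gradient is therefore 2 * S^(2T) u0, i.e. simply
# running the forward simulation for twice as many steps:
def grad_loss(u0, steps=10):
    return 2.0 * simulate(u0, steps=2 * steps)
```

Here the gradient is derived by hand, which only works because the operator is linear and self-adjoint; for general simulations this is exactly the bookkeeping that automatic differentiation through PyTorch, TensorFlow, or JAX removes.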
