
Solver-in-the-Loop

This is the source code repository for the NeurIPS'20 paper "Solver-in-the-Loop: Learning from Differentiable Physics to Interact with Iterative PDE-Solvers" by Kiwon Um, Robert Brand, Yun (Raymond) Fei, Philipp Holl, Nils Thuerey.

Additional information: project page, NeurIPS 2020 page.

3D Unsteady Wake Flow: Source vs Learned Correction vs Reference

Abstract:

Finding accurate solutions to partial differential equations (PDEs) is a crucial task in all scientific and engineering disciplines. It has recently been shown that machine learning methods can improve the solution accuracy by correcting for effects not captured by the discretized PDE. We target the problem of reducing numerical errors of iterative PDE solvers and compare different learning approaches for finding complex correction functions. We find that previously used learning approaches are significantly outperformed by methods that integrate the solver into the training loop and thereby allow the model to interact with the PDE during training. This provides the model with realistic input distributions that take previous corrections into account, yielding improvements in accuracy with stable rollouts of several hundred recurrent evaluation steps and surpassing even tailored supervised variants. We highlight the performance of the differentiable physics networks for a wide variety of PDEs, from non-linear advection-diffusion systems to three-dimensional Navier-Stokes flows.

TUM, Telecom Paris

Tutorial

Requirements

  • TensorFlow; tested with 1.15, with 2.4 for *-tf2, and with 2.4.1 for *-phi2
  • PhiFlow; master branch tested with commit-4f5e678, with commit-3ecdb62 for *-tf2, and with commit-1d292de for *-phi2

We recommend installing via pip, e.g., with pip install tensorflow-gpu==1.15 phiflow==1.5.1.
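After installing, a quick sanity check (a minimal sketch, nothing repository-specific) confirms that both packages import and that the TensorFlow version matches the variant you plan to run:

import tensorflow as tf
import phi                      # phiflow installs as the 'phi' package
print(tf.__version__)           # e.g., 1.15.x, or 2.4.x for the *-tf2 variants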

Running the Code

A Makefile with a set of targets is included in each scenario's folder. Running the targets one after another will generate training and test data, train a model, and apply it to the test cases.

For now, the unsteady wake flow (karman-2d/) and forced advection-diffusion (burgers/) scenarios in two dimensions are included in this repository. (The others will follow later on.)

Unsteady Wake Flow in 2D

For karman-2d, you can first generate data sets, for training and testing respectively, by running:

make karman-fdt-hires-set      # Generate training data set
make karman-fdt-hires-testset  # Generate test data set

This will create 1000 time steps of training data for six Reynolds numbers, and test data sets for five different Reynolds numbers. All are computed in reference space, at a higher resolution. An example is shown below.

Unsteady Wake Flow in 2D, training data

The following command uses the training data to train a SOL-32 model, i.e., one that is trained with 32 steps of differentiable physics in each Adam iteration.

make karman-fdt-sol32          # Train a model

Here the "magic" happens in the loop over msteps following line 397 of karman_train.py, which controls the number of time steps computed in each iteration. The call to simulator_lo.step() constructs a TensorFlow graph for the modified incompressible fluid solver from PhiFlow. Each sess.run() call at line 500 executes the full graph for a mini-batch and back-propagates the error via the differentiable PhiFlow operators.
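The listing below is a conceptual sketch of this training scheme, not the repository code: it unrolls several steps of a toy differentiable solver (a simple explicit diffusion step standing in for the PhiFlow solver) together with a learned correction, accumulates the loss against reference states, and back-propagates through the whole chain. It is written for TensorFlow 2.x eager mode for brevity; solver_step, correction, and the random reference data are placeholders.

import tensorflow as tf

N, MSTEPS = 64, 8                               # grid size, unrolled solver steps
DT, NU = 0.1, 0.05                              # time step, diffusion coefficient

def solver_step(u):
    # Toy differentiable "source" solver: explicit diffusion with periodic boundaries.
    lap = tf.roll(u, 1, axis=-1) - 2.0 * u + tf.roll(u, -1, axis=-1)
    return u + DT * NU * lap

correction = tf.keras.Sequential([              # stand-in for the correction network
    tf.keras.layers.Dense(64, activation="tanh"),
    tf.keras.layers.Dense(N),
])
optimizer = tf.keras.optimizers.Adam(1e-3)

u = tf.random.normal([4, N])                    # mini-batch of coarse initial states
reference = [tf.random.normal([4, N]) for _ in range(MSTEPS)]   # placeholder reference states

with tf.GradientTape() as tape:
    loss = 0.0
    for step in range(MSTEPS):                  # the solver stays "in the loop"
        u = solver_step(u)                      # differentiable solver advances the state
        u = u + correction(u)                   # learned correction is applied each step
        loss += tf.reduce_mean((u - reference[step]) ** 2)
grads = tape.gradient(loss, correction.trainable_variables)
optimizer.apply_gradients(zip(grads, correction.trainable_variables))

In the repository, the same idea is realized as a static TF1 graph: the msteps loop builds the chain of simulator_lo.step() and network calls once, and each sess.run() then evaluates and differentiates it for a mini-batch.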

Once the model is trained, you can apply it to the test data via:

make karman-fdt-sol32/run_test # Run test

This generates sequences of .npz files for velocity and a transported marker density in karman-fdt-sol32/run_test/sim_00000n, which you can visualize with your favourite numpy tools. Below you can find an example output, showing the unmodified source at the top, the corrected SOL-32 version in the middle, and the reference at the bottom.

Unsteady Wake Flow in 2D, test result
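To take a quick look at the generated fields yourself, something along the following lines works; the exact file name and the array keys inside the .npz files are assumptions here, so print them first and adjust the path to your run:

import numpy as np
import matplotlib.pyplot as plt

frame = np.load("karman-fdt-sol32/run_test/sim_000000/density_000100.npz")   # hypothetical frame
print(frame.files)                            # check which arrays the file actually contains
field = np.squeeze(frame[frame.files[0]])     # take the first stored array as a 2D field
plt.imshow(field, origin="lower", cmap="magma")
plt.colorbar()
plt.show()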

The NON model discussed in our paper can be trained via the target karman-fdt-non. The Makefile additionally contains -pre targets to compute the PRE training data mentioned in the paper and to train a corresponding model.

Forced Advection-Diffusion in 2D

Similar to the unsteady wake flow scenario, you can use the Makefile targets in burgers/ to generate data, train a model, and run the tests.

Closing Remarks

If you find the approach useful, please cite our paper via:

@article{um2020sol,
  title="{Solver-in-the-Loop: Learning from Differentiable Physics to Interact with Iterative PDE-Solvers}",
  author={Um, Kiwon and Brand, Robert and Fei, Yun and Holl, Philipp and Thuerey, Nils},
  journal={Advances in Neural Information Processing Systems},
  year={2020}
}

This work is supported by the ERC Starting Grant realFlow (StG-2015-637014).

Feel free to contact us if you have questions or suggestions.

Here is one more example of a 3D wake flow generated by a hybrid NN-powered SOL solver trained with the methodology above: Main paper teaser
