JAX-SPH: A Differentiable Smoothed Particle Hydrodynamics Framework (2403.04750v2)

Published 7 Mar 2024 in physics.flu-dyn and cs.LG

Abstract: Particle-based fluid simulations have emerged as a powerful tool for solving the Navier-Stokes equations, especially in cases that include intricate physics and free surfaces. The recent addition of machine learning methods to the toolbox for solving such problems is pushing the boundary of the quality vs. speed tradeoff of such numerical simulations. In this work, we lead the way to Lagrangian fluid simulators compatible with deep learning frameworks, and propose JAX-SPH - a Smoothed Particle Hydrodynamics (SPH) framework implemented in JAX. JAX-SPH builds on the code for dataset generation from the LagrangeBench project (Toshev et al., 2023) and extends this code in multiple ways: (a) integration of further key SPH algorithms, (b) restructuring the code toward a Python package, (c) verification of the gradients through the solver, and (d) demonstration of the utility of the gradients for solving inverse problems as well as a Solver-in-the-Loop application. Our code is available at https://github.com/tumaer/jax-sph.

References (49)
  1. TensorFlow, Large-scale machine learning on heterogeneous systems, November 2015.
  2. A generalized wall boundary condition for smoothed particle hydrodynamics. Journal of Computational Physics, 231(21):7057–7075, 2012.
  3. A transport-velocity formulation for smoothed particle hydrodynamics. Journal of Computational Physics, 241:292–307, 2013.
  4. Universal physics transformers, 2024.
  5. Inverse design for fluid-structure interactions using graph network simulators, 2022.
  6. Relational inductive biases, deep learning, and graph networks. arXiv preprint arXiv:1806.01261, 2018.
  7. JAX-Fluids: A fully-differentiable high-order computational fluid dynamics solver for compressible two-phase flows. Computer Physics Communications, pp. 108527, 2022. ISSN 0010-4655. doi: 10.1016/j.cpc.2022.108527.
  8. Small-scale structure of the Taylor–Green vortex. Journal of Fluid Mechanics, 130:411–452, 1983.
  9. The Taylor–Green vortex and fully developed turbulence. Journal of Statistical Physics, 34(5-6):1049–1063, 1984.
  10. JAX: composable transformations of Python+NumPy programs, 2018.
  11. Machine Learning for Partial Differential Equations. arXiv preprint arXiv:2303.17078, March 2023.
  12. Paul W. Cleary. Modelling confined multi-material heat and mass flows using SPH. Applied Mathematical Modelling, 22(12):981–993, 1998.
  13. Numerical simulation of interfacial flows by smoothed particle hydrodynamics. Journal of Computational Physics, 191(2):448–475, 2003.
  14. DualSPHysics: Open-source parallel CFD solver based on smoothed particle hydrodynamics (SPH). Computer Physics Communications, 187:204–216, 2015.
  15. Reverse Poiseuille flow: the numerical viscometer. In AIP Conference Proceedings, volume 1027, pp. 1432–1434. American Institute of Physics, 2008.
  16. High-Re solutions for incompressible flow using the Navier-Stokes equations and a multigrid method. Journal of Computational Physics, 48(3):387–411, 1982.
  17. Evaluating derivatives: principles and techniques of algorithmic differentiation. SIAM, 2008.
  18. Learning to control PDEs with differentiable physics. In International Conference on Learning Representations, 2020.
  19. Physics-informed neural networks for inverse problems in supersonic flows. Journal of Computational Physics, 466:111402, 2022.
  20. Composing partial differential equations with physics-aware neural networks. In International Conference on Machine Learning, pp.  10773–10801. PMLR, 2022.
  21. Patrick Kidger. On neural differential equations, 2022.
  22. Machine learning–accelerated computational fluid dynamics. Proceedings of the National Academy of Sciences, 118(21):e2101784118, 2021.
  23. Smoothed Particle Hydrodynamics Techniques for the Physics Based Simulation of Fluids and Solids. In Wenzel Jakob and Enrico Puppo (eds.), Eurographics 2019 - Tutorials. The Eurographics Association, 2019. doi: 10.2312/egt.20191035.
  24. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278–2324, 1998. doi: 10.1109/5.726791.
  25. Graph neural network-accelerated lagrangian fluid simulation. Computers & Graphics, 103:201–211, 2022.
  26. Learning the dynamics of physical systems from sparse observations with finite element networks. In International Conference on Learning Representations (ICLR), 2022.
  27. JuSPH: A Julia-based open-source package of parallel smoothed particle hydrodynamics (SPH) for dam break problems. SoftwareX, 19:101151, 2022.
  28. A comparison of automatic differentiation and continuous sensitivity analysis for derivatives of differential equation solutions. In 2021 IEEE High Performance Extreme Computing Conference (HPEC), pp.  1–9. IEEE, 2021.
  29. Joe J. Monaghan. Simulating free surface flows with SPH. Journal of Computational Physics, 110(2):399–406, 1994.
  30. Joe J. Monaghan. Smoothed particle hydrodynamics. Reports on Progress in Physics, 68(8):1703, 2005.
  31. Modeling low Reynolds number incompressible flows using SPH. Journal of Computational Physics, 136(1):214–226, 1997.
  32. A comparison of the continuous and discrete adjoint approach to automatic aerodynamic optimization. In 38th Aerospace sciences meeting and exhibit, pp.  667, 2000.
  33. PyTorch: An Imperative Style, High-Performance Deep Learning Library. In H. Wallach, H. Larochelle, A. Beygelzimer, F. d’Alché Buc, E. Fox, and R. Garnett (eds.), Advances in Neural Information Processing Systems 32, pp.  8024–8035. Curran Associates, Inc., 2019.
  34. Daniel J Price. Smoothed particle hydrodynamics and magnetohydrodynamics. Journal of Computational Physics, 231(3):759–794, 2012.
  35. PySPH: A Python-based framework for smoothed particle hydrodynamics. ACM Trans. Math. Softw., 47(4), September 2021. ISSN 0098-3500. doi: 10.1145/3460773.
  36. Learning to simulate complex physics with graph networks. In International Conference on Machine Learning, 2020.
  37. The graph neural network model. IEEE transactions on neural networks, 20(1):61–80, 2008.
  38. SPNets: Differentiable fluid dynamics for deep neural networks, 2018.
  39. JAX MD: A framework for differentiable physics. In Advances in Neural Information Processing Systems, volume 33. Curran Associates, Inc., 2020.
  40. Physics-based Deep Learning. arXiv preprint arXiv:2109.05237, 2021.
  41. Learning Lagrangian fluid mechanics with E(3)-equivariant graph neural networks. In Frank Nielsen and Frédéric Barbaresco (eds.), Geometric Science of Information, pp. 332–341, Cham, 2023a. Springer Nature Switzerland. ISBN 978-3-031-38299-4.
  42. LagrangeBench: A Lagrangian fluid mechanics benchmarking suite, 2023b.
  43. On the relationships between graph neural networks for the simulation of physical systems and classical numerical methods, 2023c.
  44. Neural SPH: Improved neural modeling of Lagrangian fluid dynamics, 2024.
  45. Solver-in-the-Loop: Learning from Differentiable Physics to Interact with Iterative PDE-Solvers. Advances in Neural Information Processing Systems, 2020.
  46. Holger Wendland. Piecewise polynomial, positive definite and compactly supported radial functions of minimal degree. Advances in Computational Mathematics, 4:389–396, 1995.
  47. A generalized transport-velocity formulation for smoothed particle hydrodynamics. Journal of Computational Physics, 337:216–232, 2017a.
  48. A weakly compressible SPH method based on a low-dissipation Riemann solver. Journal of Computational Physics, 335:605–620, 2017b.
  49. SPHinXsys: An open-source multi-physics and multi-resolution library based on smoothed particle hydrodynamics. Computer Physics Communications, 267:108066, 2021.

Summary

  • The paper introduces JAX-SPH, a framework that integrates SPH simulations with JAX to compute gradients for inverse problems and solver-in-the-loop applications.
  • The paper validates gradient reliability by comparing analytical and numerical schemes in simulations such as the Taylor-Green vortex and lid-driven cavity.
  • The paper demonstrates enhanced accuracy and efficiency in CFD by incorporating advanced SPH algorithms and neural correction models.

Overview of JAX-SPH: A Step Towards Integrating SPH Simulations with Machine Learning

In the field of computational fluid dynamics (CFD), Smoothed Particle Hydrodynamics (SPH) has secured a strong foothold thanks to its ability to capture complex fluid behaviors, especially in applications involving free surfaces. The recent confluence of machine learning (ML) with traditional numerical simulation techniques has opened new prospects in terms of accuracy and computational efficiency. Within this burgeoning field, "JAX-SPH: A Differentiable Smoothed Particle Hydrodynamics Framework" marks a significant advancement.

The Driving Force Behind JAX-SPH

At the core of this paper lies the development of JAX-SPH, a framework designed to bridge the SPH simulation approach with the computational efficiency and gradient computation capabilities of JAX, a high-performance numerical computing library. The framework extends the dataset generation code of the LagrangeBench project (Toshev et al., 2023) by incorporating further key SPH algorithms, restructuring the code into a Python package, verifying the gradients obtained through the solver, and demonstrating the practical utility of these gradients for solving inverse problems and for a Solver-in-the-Loop (SitL) application.
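
To make the central idea concrete, the following minimal sketch, written against plain JAX rather than the actual jax-sph API, shows how an SPH smoothing kernel expressed in JAX yields its radial derivative, a quantity every SPH force term needs, via automatic differentiation instead of a hand-derived formula. The cubic spline kernel and its 1D normalization constant are standard; the function names are illustrative.

```python
# Minimal sketch, assuming plain JAX rather than the actual jax-sph API:
# an SPH smoothing kernel whose radial derivative dW/dr (needed in every
# SPH force term) is obtained by automatic differentiation.
import jax
import jax.numpy as jnp


def cubic_spline_kernel(r, h):
    """Cubic spline kernel W(r, h) with support radius 2h (1D normalization)."""
    q = r / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization constant
    w = jnp.where(
        q < 1.0,
        1.0 - 1.5 * q**2 + 0.75 * q**3,
        jnp.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0),
    )
    return sigma * w


# dW/dr via autodiff, jit-compiled and vectorized over particle-pair distances.
dW_dr = jax.jit(jax.vmap(jax.grad(cubic_spline_kernel), in_axes=(0, None)))

r_ij = jnp.linspace(0.1, 2.5, 5)  # toy pairwise distances
print(dW_dr(r_ij, 1.0))
```

Because the solver is built from JAX-traceable pieces of this kind, the same mechanism extends to gradients of entire simulation rollouts with respect to physical parameters or initial conditions.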

Comprehensive Features and Validation

JAX-SPH introduces several noteworthy features, including the transport velocity formulation, Riemann SPH, and thermal diffusion effects, together with a methodical validation of the solver's accuracy and of the fidelity of the automatically derived gradients. Through rigorous gradient validation against analytical and numerical reference schemes, the paper establishes the reliability of JAX-SPH's gradient computations across several simulation scenarios, including the Taylor-Green vortex and the lid-driven cavity.
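
The flavor of such a gradient check can be sketched as follows, with a toy explicit-Euler rollout standing in for a real SPH step; the function names and dynamics here are illustrative assumptions, not taken from the paper. The gradient obtained by automatic differentiation is compared against a central finite-difference estimate.

```python
# Hedged sketch of a gradient check, assuming a toy explicit-Euler rollout
# (`advect`) in place of a real SPH step: compare the autodiff gradient of a
# scalar loss with a central finite-difference estimate.
import jax
import jax.numpy as jnp


def advect(x, v, dt=0.01, steps=10):
    """Toy rollout: explicit Euler updates with a position-dependent force."""
    for _ in range(steps):
        v = v - dt * jnp.sin(x)
        x = x + dt * v
    return jnp.sum(x ** 2)  # scalar loss over final positions


x0 = jnp.array([0.3, 1.1, 2.0])
v0 = jnp.zeros(3)

# Analytical (autodiff) gradient with respect to the initial positions.
g_ad = jax.grad(advect)(x0, v0)

# Numerical (central finite-difference) gradient, one component at a time.
eps = 1e-4
g_fd = jnp.stack(
    [
        (advect(x0.at[i].add(eps), v0) - advect(x0.at[i].add(-eps), v0)) / (2 * eps)
        for i in range(x0.shape[0])
    ]
)

print(jnp.max(jnp.abs(g_ad - g_fd)))  # should be close to zero
```

Agreement up to discretization and floating-point error indicates that the differentiation path through the solver is intact.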

Practical Applications Demonstrated

Beyond the technical enhancements, this research underscores the applicability of JAX-SPH to inverse problems and to Solver-in-the-Loop methodologies. The inverse-problem demonstration, in which the initial conditions of a fluid simulation are deduced from its final state, serves as a compelling proof of concept for the potential of differentiable solvers in design and control tasks. Furthermore, the Solver-in-the-Loop experiment showcases JAX-SPH's versatility: the differentiable solver is combined with a neural correction model to refine a coarsely resolved fluid simulation.
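
The structure of such an inverse problem can be illustrated with a deliberately simplified sketch: unknown initial velocities are recovered from an observed final state by gradient descent through a differentiable rollout. Here `rollout` is a toy stand-in for the SPH solver, and all names and dynamics are assumptions for illustration rather than the paper's actual setup.

```python
# Illustrative sketch, not the paper's exact setup: recover unknown initial
# velocities from an observed final state by gradient descent through a
# differentiable toy rollout (`rollout` stands in for the SPH solver).
import jax
import jax.numpy as jnp


def rollout(x0, v0, dt=0.01, steps=50):
    x, v = x0, v0
    for _ in range(steps):
        v = v - dt * jnp.sin(x)  # toy position-dependent force
        x = x + dt * v
    return x


x_init = jnp.linspace(0.0, 1.0, 8)
v_true = jnp.sin(2.0 * jnp.pi * x_init)  # "unknown" initial velocity field
x_target = rollout(x_init, v_true)       # observed final positions


def loss(v_guess):
    return jnp.mean((rollout(x_init, v_guess) - x_target) ** 2)


grad_fn = jax.jit(jax.grad(loss))
v_guess = jnp.zeros_like(v_true)
for _ in range(500):                     # plain gradient descent
    v_guess = v_guess - 0.5 * grad_fn(v_guess)

print(jnp.max(jnp.abs(v_guess - v_true)))  # small once the optimization converges
```

Roughly speaking, the Solver-in-the-Loop setting follows the same pattern, except that a learned correction network acts inside the rollout and is trained by differentiating through the solver steps.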

Future Directions and Potentials

While this work pioneers the combination of SPH simulation with machine-learning workflows through JAX-SPH, it also opens avenues for tighter integration of SPH with ML pipelines, including hybrid solver architectures. Anticipated future directions include expanding the set of SPH algorithms implemented in JAX-SPH, refining gradient computation methodologies, and exploring applications in PDE-constrained optimization and physics-informed machine learning models for fluid dynamics.

Concluding Thoughts

In summary, "JAX-SPH: A Differentiable Smoothed Particle Hydrodynamics Framework" represents a substantial stride towards harmonizing traditional fluid simulation approaches with the computational capabilities of modern machine learning libraries. By validating its utility on canonical test cases and inverse-problem settings and by laying a robust foundation for future work, it enriches the toolbox available to researchers in computational fluid dynamics and applied mathematics, and it signals the growing potential of hybrid numerical-ML solvers in advancing scientific computing.