Finite Volume Graph Network (FVGN): Predicting unsteady incompressible fluid dynamics with finite volume informed neural network (2309.10050v4)

Published 18 Sep 2023 in physics.flu-dyn and physics.comp-ph

Abstract: The rapid development of deep learning has significant implications for the advancement of Computational Fluid Dynamics (CFD). Currently, most pixel-grid-based deep learning methods for flow field prediction exhibit significantly reduced accuracy in boundary layer flows and adapt poorly to varying geometries. Although Graph Neural Network (GNN) models for unsteady flow prediction on unstructured grids offer better geometric adaptability, they suffer from error accumulation in long-term predictions of unsteady flows. More importantly, fully data-driven models often require extensive training time, which greatly limits how quickly such models can be updated and iterated when faced with more complex unsteady flows. This paper therefore aims to balance training overhead against prediction accuracy by integrating physical constraints based on the finite volume method into the loss function of a graph neural network. It also incorporates a twice-message aggregation mechanism, inspired by the extended stencil method, to improve the unsteady flow prediction accuracy and geometric generalization ability of the graph neural network on unstructured grids. We focus particularly on the model's predictive accuracy within the boundary layer. Compared to fully data-driven methods, our model achieves better predictive accuracy and geometric generalization in a shorter training time.
