
CFDBench: A Large-Scale Benchmark for Machine Learning Methods in Fluid Dynamics (2310.05963v2)

Published 13 Sep 2023 in cs.LG, physics.comp-ph, and physics.flu-dyn

Abstract: In recent years, applying deep learning to solve physics problems has attracted much attention. Data-driven deep learning methods produce fast numerical operators that can learn approximate solutions to the whole system of partial differential equations (i.e., surrogate modeling). Although these neural networks may have lower accuracy than traditional numerical methods, they, once trained, are orders of magnitude faster at inference. Hence, one crucial feature is that these operators can generalize to unseen PDE parameters without expensive re-training. In this paper, we construct CFDBench, a benchmark tailored for evaluating the generalization ability of neural operators after training in computational fluid dynamics (CFD) problems. It features four classic CFD problems: lid-driven cavity flow, laminar boundary layer flow in circular tubes, dam flows through the steps, and periodic Karman vortex street. The data contains a total of 302K frames of velocity and pressure fields, involving 739 cases with different operating condition parameters, generated with numerical methods. We evaluate the effectiveness of popular neural operators including feed-forward networks, DeepONet, FNO, U-Net, etc. on CFDBench by predicting flows with non-periodic boundary conditions, fluid properties, and flow domain shapes that are not seen during training. Appropriate modifications were made to apply popular deep neural networks to CFDBench and enable the accommodation of more changing inputs. Empirical results on CFDBench show many baseline models have errors as high as 300% in some problems, and severe error accumulation when performing autoregressive inference. CFDBench facilitates a more comprehensive comparison between different neural operators for CFD compared to existing benchmarks.

Authors (3)
  1. Yining Luo
  2. Yingfa Chen
  3. Zhen Zhang
Citations (2)

Summary

  • The paper introduces CFDBench, featuring over 302,000 CFD frames from four canonical flow problems to test neural network generalization.
  • It evaluates various architectures, including U-Net and FNO, in both autoregressive and non-autoregressive settings to address real-world CFD complexities.
  • The study highlights the importance of matching model designs with CFD problem characteristics to achieve faster, more reliable inference.

Analysis of CFDBench: A Benchmark for Evaluating Neural Operators in Fluid Dynamics

The paper "CFDBench: A Large-Scale Benchmark for Machine Learning Methods in Fluid Dynamics" presents CFDBench, a benchmark designed to evaluate the generalization ability of neural networks in computational fluid dynamics (CFD). The benchmark includes data from four canonical CFD problems: lid-driven cavity flow, laminar boundary layer flow in circular tubes, dam flows over obstacles, and periodic Karman vortex streets. The dataset contains over 302,000 frames of velocity and pressure fields, sampled across different parameters like boundary conditions, fluid properties, and geometries, which are critical for testing a model’s ability to generalize to unseen fluid dynamics scenarios.

Context and Motivation

The motivation behind creating CFDBench stems from the challenges posed by traditional numerical methods for CFD, including high computational costs and the difficulty of handling complex geometries. Although deep learning approaches are generally less accurate than traditional methods, they offer significant speed advantages at inference time. CFDBench enables the examination of neural operators' generalization without retraining, a critical step towards practical adoption in industrial applications.

Methods Evaluated

The paper evaluates several well-known architectures in the context of CFD:

  • Feed-Forward Networks (FFNs) and variants in both autoregressive and non-autoregressive settings
  • DeepONet and enhancements such as Auto-EDeepONet
  • ResNet and U-Net, well-established architectures in image processing
  • FNO (Fourier Neural Operator), known for leveraging frequency-domain transformations

The methods differ in how they incorporate operating parameters such as boundary conditions and fluid properties into the problem setting, which directly influences model performance and ease of adaptation.

Key Findings

A notable insight from the experiments was the importance of matching model architectures to the intrinsic nature of the CFD problems. For instance, U-Net showed superior performance on tasks without source terms such as gravity, while FNO excelled in flows with periodic vortex shedding, owing to its frequency-domain representation. The results also indicated that current data-driven methods cannot yet fully replace traditional solvers, given their limited ability to generalize to complex and dynamically varying conditions.

Implications and Future Work

CFDBench represents a significant stride towards standardized evaluation protocols for applying deep learning to CFD. Its design facilitates comparative analysis across varying conditions, thus advancing the field's understanding of model behaviors in unseen physics-driven scenarios. As future research delves deeper into developing neural architectures that enhance generalization capabilities, CFDBench can serve as a critical resource for benchmarking such advancements.

Practical implications of the paper extend to various industries relying on fluid dynamics simulations. Improved simulation methods, especially those leveraging fast deep learning inference, can lead to more efficient design processes in areas like aerodynamics, meteorology, and hydraulics.

Conclusion

The introduction of CFDBench is positioned to have a significant impact on accelerating the adoption of machine learning methods in CFD research. However, the notable discrepancies in error rates across different fluid dynamics scenarios underscore the necessity for continued research and the development of novel architectures capable of capturing the complex dynamics present in real-world applications. The observations made offer foundational insights into which architectures are currently best suited to various fluid dynamics challenges and pave the way for the continuous improvement of machine learning models in physics-based simulations. CFDBench, with its emphasis on evaluating generalization to diverse, unseen conditions, establishes a solid framework for future explorations in this increasingly critical intersection of fluid dynamics and machine learning.
