
Turbulence closure modeling with data-driven techniques: physical compatibility and consistency considerations (2004.03031v1)

Published 6 Apr 2020 in physics.flu-dyn and physics.comp-ph

Abstract: A recent thrust in turbulence closure modeling research is to incorporate machine learning (ML) elements, such as neural networks, for the purpose of extending predictive capability to a broader class of flows. Such a turbulence closure framework entails solving a system of equations comprising ML functionals coupled with traditional physics-based (PB) elements. While combining closure elements from fundamentally different ideologies can lead to unprecedented progress, there are many critical challenges that must be overcome. This study examines three such challenges: (i) physical compatibility (or lack thereof) between the ML and PB constituents of the modeling system of equations; (ii) internal (self) consistency of the ML training process; and (iii) formulation of an optimal objective (or loss) function for training. These issues are critically important for generalization of ML-enhanced methods to predictive computations of complex engineering flows. Training and implementation strategies in current practice that may lead to significant incompatibilities and inconsistencies are identified. Using the simple test case of turbulent channel flow, key deficiencies are highlighted and proposals for mitigating them are investigated. Compatibility constraints are evaluated, and it is demonstrated that an iterative training procedure can help ensure a certain degree of consistency. In summary, this work develops foundational tenets to guide development of ML-enhanced turbulence closure models.
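
The iterative-training idea mentioned in the abstract can be illustrated with a minimal sketch. Everything below is a placeholder assumption made for illustration, not the authors' setup: the toy 1-D momentum solver, the shear-magnitude input feature, the linear least-squares "closure" standing in for a neural network, and the fabricated surrogate "DNS" eddy-viscosity target. The point is only the structure of the loop, in which the closure's training inputs are regenerated from fields computed with the current closure, so that the features seen during training are consistent with those seen when the model is deployed inside the solver.

```python
# Hedged sketch of iterative (model-consistent) training for an ML closure.
# All components are toy stand-ins chosen for readability, not the paper's method.
import numpy as np

ny = 65
y = np.linspace(0.0, 2.0, ny)        # hypothetical channel of height 2h
dy = y[1] - y[0]
nu = 2e-3                            # molecular viscosity (arbitrary units)

def solve_mean_flow(nu_t):
    """Toy RANS-like 1-D solve of d/dy[(nu + nu_t) du/dy] = -1 with no-slip walls."""
    nu_eff = nu + nu_t
    face = 0.5 * (nu_eff[:-1] + nu_eff[1:])          # effective viscosity at cell faces
    A = np.zeros((ny, ny))
    b = np.full(ny, -dy**2)
    for i in range(1, ny - 1):
        A[i, i - 1] = face[i - 1]
        A[i, i + 1] = face[i]
        A[i, i] = -(face[i - 1] + face[i])
    A[0, 0] = A[-1, -1] = 1.0
    b[0] = b[-1] = 0.0                                # no-slip boundary conditions
    return np.linalg.solve(A, b)

def features(u):
    """Closure input: local mean-shear magnitude (placeholder feature set)."""
    return np.abs(np.gradient(u, dy)).reshape(-1, 1)

def fit_closure(X, nu_t_target):
    """'Training' step: least-squares fit nu_t ~ w*|du/dy| + b (stand-in for a NN)."""
    A = np.hstack([X, np.ones_like(X)])
    w, b = np.linalg.lstsq(A, nu_t_target, rcond=None)[0]
    return lambda Xq: np.clip(Xq[:, 0] * w + b, 0.0, None)

# Surrogate "DNS" eddy-viscosity target, fabricated here purely for illustration.
nu_t_dns = 0.02 * y * (2.0 - y)

closure = lambda Xq: np.full(Xq.shape[0], 1e-3)      # crude initial closure
u_model = np.zeros(ny)

for it in range(5):
    # (a) Run the solver with the current closure until the mean field and the
    #     closure inputs are (approximately) mutually consistent.
    for _ in range(3):
        nu_t_model = closure(features(u_model))
        u_model = solve_mean_flow(nu_t_model)
    # (b) Re-train the closure on features extracted from the *model-computed*
    #     field, with the high-fidelity eddy viscosity as the target.
    closure = fit_closure(features(u_model), nu_t_dns)
    err = np.linalg.norm(closure(features(u_model)) - nu_t_dns) / np.linalg.norm(nu_t_dns)
    print(f"outer iteration {it}: relative closure mismatch = {err:.3f}")
```

In an actual application the least-squares fit would be replaced by a neural-network training step and the toy solver by the full RANS model; the alternation between solving with the current closure and retraining on the resulting fields is the consistency mechanism the abstract alludes to.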
