A transformer-based synthetic-inflow generator for spatially-developing turbulent boundary layers (2206.01618v2)

Published 3 Jun 2022 in physics.flu-dyn

Abstract: This study proposes a newly developed deep-learning-based method to generate turbulent inflow conditions for spatially-developing turbulent boundary layer (TBL) simulations. A combination of a transformer and a multiscale-enhanced super-resolution generative adversarial network is used to predict velocity fields of a spatially-developing TBL at various planes normal to the streamwise direction. Direct numerical simulation (DNS) datasets of flat-plate flow spanning a momentum-thickness-based Reynolds number range of Re_theta = 661.5-1502.0 are used to train and test the model. The model shows a remarkable ability to predict instantaneous velocity fields with detailed fluctuations and reproduces the turbulence statistics as well as the spatial and temporal spectra with commendable accuracy compared with the DNS results. The proposed model also exhibits reasonable accuracy when predicting velocity fields at Reynolds numbers not used during training. With the aid of transfer learning, the computational cost of the proposed model is kept effectively low. The results demonstrate, for the first time, that transformer-based models can be efficient in predicting the dynamics of turbulent flows. They also show that combining these models with generative adversarial network-based models can be useful in tackling various turbulence-related problems, including the development of efficient synthetic-turbulence inflow generators.
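The abstract describes a two-stage pipeline: a transformer predicts the temporal evolution of coarse cross-plane velocity fields, and a super-resolution GAN generator refines them to the fine grid used as the inflow plane. The sketch below illustrates that idea only; it is not the authors' code, and all class names, grid sizes, and hyperparameters are hypothetical assumptions.

```python
# Illustrative sketch (not the paper's implementation): a transformer predicts the
# next coarse cross-plane velocity field from a short history of previous planes,
# and a super-resolution generator upsamples it to a finer grid. The GAN
# discriminator and the training loop are omitted for brevity.
import torch
import torch.nn as nn

class PlanePredictor(nn.Module):
    """Transformer over a temporal sequence of flattened coarse velocity planes."""
    def __init__(self, coarse_hw=(16, 16), n_components=3, d_model=256, n_layers=4):
        super().__init__()
        in_dim = coarse_hw[0] * coarse_hw[1] * n_components
        self.embed = nn.Linear(in_dim, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, in_dim)
        self.coarse_hw, self.n_components = coarse_hw, n_components

    def forward(self, planes):                    # planes: (B, T, C, H, W)
        b, t, c, h, w = planes.shape
        x = self.embed(planes.reshape(b, t, -1))  # (B, T, d_model)
        x = self.encoder(x)
        out = self.head(x[:, -1])                 # predict the next plane from the last state
        return out.reshape(b, c, h, w)

class SRGenerator(nn.Module):
    """Upsamples a coarse plane to the fine inflow grid (generator only)."""
    def __init__(self, n_components=3, scale=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_components, 64, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, n_components, 3, padding=1),
        )

    def forward(self, coarse_plane):              # (B, C, h, w) -> (B, C, h*scale, w*scale)
        return self.net(coarse_plane)

# Usage: roll the predictor forward in time to produce a synthetic inflow sequence.
predictor, generator = PlanePredictor(), SRGenerator()
history = torch.randn(1, 8, 3, 16, 16)            # 8 previous coarse planes (u, v, w)
next_coarse = predictor(history)                  # (1, 3, 16, 16)
next_fine = generator(next_coarse)                # (1, 3, 64, 64) inflow plane
```

In such a setup, each predicted fine plane would be fed to the simulation as the inflow boundary condition and appended to the history for the next prediction step; the specific grids, sequence lengths, and loss functions used in the paper are not given in the abstract.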
