
FlowBERT: Prompt-tuned BERT for variable flow field prediction (2506.08021v1)

Published 20 May 2025 in cs.LG and physics.flu-dyn

Abstract: This study proposes a universal flow field prediction framework based on knowledge transfer from large language models (LLMs), addressing the high computational costs of traditional computational fluid dynamics (CFD) methods and the limited cross-condition transfer capability of existing deep learning models. The framework integrates Proper Orthogonal Decomposition (POD) dimensionality reduction with fine-tuning strategies for a pretrained LLM: POD provides a compressed representation of flow field features, while the fine-tuned model learns to encode the system dynamics in state space. To enhance the model's adaptability to flow field data, we designed fluid dynamics-oriented text templates that improve predictive performance through enriched contextual semantic information. Experimental results demonstrate that our framework outperforms conventional Transformer models in few-shot learning scenarios while exhibiting strong generalization across various inflow conditions and airfoil geometries. Ablation studies reveal the contributions of key components in the FlowBERT architecture. Compared to traditional Navier-Stokes equation solvers requiring hours of computation, our approach reduces prediction time to seconds while maintaining over 90% accuracy. The developed knowledge transfer paradigm establishes a new direction for rapid fluid dynamics prediction, with potential applications extending to aerodynamic optimization, flow control, and other engineering domains.
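The abstract describes a two-stage pipeline: POD compresses flow snapshots into a few modal coefficients, and the fine-tuned language model then predicts the evolution of those coefficients in the reduced state space. The paper does not publish code, so the following is only a minimal sketch of the POD compression stage under common conventions (snapshot matrix of flattened fields, truncated SVD); the function and variable names (`pod_reduce`, `snapshots`, `r`) are illustrative, not from the paper.

```python
# Sketch of the POD reduction step: compress flow snapshots to r modal
# coefficients, which a sequence model (here, the fine-tuned BERT-style LLM)
# would then learn to predict. Assumes snapshots are flattened flow fields.
import numpy as np

def pod_reduce(snapshots: np.ndarray, r: int):
    """snapshots : (n_t, n_x) array, one flattened flow field per row.

    Returns the mean field, the first r spatial POD modes, and the
    (n_t, r) time series of modal coefficients (the reduced states).
    """
    mean = snapshots.mean(axis=0)
    fluctuations = snapshots - mean                # remove the mean flow
    # Thin SVD: rows of Vt are spatial modes, U * S are temporal coefficients.
    U, S, Vt = np.linalg.svd(fluctuations, full_matrices=False)
    modes = Vt[:r]                                 # (r, n_x) spatial POD modes
    coeffs = U[:, :r] * S[:r]                      # (n_t, r) reduced states
    return mean, modes, coeffs

def pod_reconstruct(mean, modes, coeffs):
    """Map predicted POD coefficients back to full flow fields."""
    return mean + coeffs @ modes
```

Predicted coefficient sequences from the language model would be passed through `pod_reconstruct` to recover full flow fields; how the coefficients are embedded into the model's prompt templates is specific to the paper and not reproduced here.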
