Predicting waves in fluids with deep neural network (2201.06628v4)

Published 17 Jan 2022 in physics.flu-dyn and cs.LG

Abstract: In this paper, we present a deep learning technique for data-driven predictions of wave propagation in a fluid medium. The technique relies on an attention-based convolutional recurrent autoencoder network (AB-CRAN). To construct a low-dimensional representation of wave propagation data, we employ a denoising-based convolutional autoencoder. The AB-CRAN architecture with attention-based long short-term memory cells forms our deep neural network model for the time marching of the low-dimensional features. We assess the proposed AB-CRAN framework against the standard recurrent neural network for the low-dimensional learning of wave propagation. To demonstrate the effectiveness of the AB-CRAN model, we consider three benchmark problems, namely, one-dimensional linear convection, the nonlinear viscous Burgers equation, and the two-dimensional Saint-Venant shallow water system. Using the spatial-temporal datasets from the benchmark problems, our novel AB-CRAN architecture accurately captures the wave amplitude and preserves the wave characteristics of the solution for long time horizons. The attention-based sequence-to-sequence network increases the time-horizon of prediction compared to the standard recurrent neural network with long short-term memory cells. The denoising autoencoder further reduces the mean squared error of prediction and improves the generalization capability in the parameter space.
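Below is a minimal sketch of an AB-CRAN-style pipeline, intended only to illustrate the two components the abstract describes: a convolutional autoencoder that compresses each wave snapshot to a low-dimensional latent vector, and an LSTM encoder-decoder with attention that marches that latent state forward in time. The layer sizes, grid resolution, sequence lengths, and the additive-attention form are illustrative assumptions, not the authors' exact architecture or training setup (the paper's denoising objective and hyperparameters are omitted).

```python
# Illustrative AB-CRAN-style sketch (not the paper's exact model):
# 1D conv autoencoder for snapshot compression + attention-based LSTM
# sequence-to-sequence network for time marching of latent features.
import torch
import torch.nn as nn


class ConvAutoencoder(nn.Module):
    """Compress a wave snapshot (1 channel x 128 grid points) to a latent vector."""
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 32, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 32), nn.ReLU(),
            nn.Unflatten(1, (32, 32)),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z


class AttentionSeq2Seq(nn.Module):
    """LSTM encoder-decoder with additive attention over the encoder states."""
    def __init__(self, latent_dim=32, hidden=64):
        super().__init__()
        self.enc = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.dec = nn.LSTMCell(latent_dim, hidden)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, latent_dim)

    def forward(self, z_hist, n_future):
        enc_out, (h, c) = self.enc(z_hist)            # encoder states: (B, T, H)
        h, c = h[0], c[0]
        z_t, preds = z_hist[:, -1], []
        for _ in range(n_future):
            h, c = self.dec(z_t, (h, c))
            # additive attention: score each encoder state against the decoder state
            scores = self.attn(torch.cat(
                [enc_out, h.unsqueeze(1).expand_as(enc_out)], dim=-1))
            ctx = (torch.softmax(scores, dim=1) * enc_out).sum(dim=1)
            z_t = self.out(torch.cat([h, ctx], dim=-1))
            preds.append(z_t)
        return torch.stack(preds, dim=1)               # (B, n_future, latent_dim)


# Usage on random data with the illustrative shapes above.
ae, prop = ConvAutoencoder(), AttentionSeq2Seq()
snapshots = torch.randn(4, 10, 1, 128)                 # batch of 10-step histories
z_hist = torch.stack([ae.encoder(snapshots[:, t]) for t in range(10)], dim=1)
z_future = prop(z_hist, n_future=20)                   # march 20 steps ahead in latent space
u_future = ae.decoder(z_future.reshape(-1, 32)).reshape(4, 20, 1, 128)
```

In the paper's framing, the attention mechanism lets the decoder weight all encoded history states at each prediction step rather than relying on the final hidden state alone, which is what extends the usable prediction horizon relative to a plain LSTM.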

Citations (23)
