
Preserving Lagrangian structure in data-driven reduced-order modeling of large-scale dynamical systems (2203.06361v3)

Published 12 Mar 2022 in math.NA and cs.NA

Abstract: This work presents a nonintrusive physics-preserving method to learn reduced-order models (ROMs) of Lagrangian systems, which include nonlinear wave equations. Existing intrusive projection-based model reduction approaches construct structure-preserving Lagrangian ROMs by projecting the Euler-Lagrange equations of the full-order model (FOM) onto a linear subspace. This Galerkin projection step requires complete knowledge of the Lagrangian operators in the FOM and full access to manipulate the computer code. In contrast, the proposed Lagrangian operator inference approach embeds the mechanics into the operator inference framework to develop a data-driven model reduction method that preserves the underlying Lagrangian structure. The proposed approach exploits knowledge of the governing equations (but not their discretization) to define the form and parametrization of a Lagrangian ROM, which can then be learned from projected snapshot data. The method does not require access to FOM operators or computer code. The numerical results demonstrate Lagrangian operator inference on an Euler-Bernoulli beam model, the sine-Gordon (nonlinear) wave equation, and a large-scale discretization of a soft robot fishtail with 779,232 degrees of freedom. The learned Lagrangian ROMs generalize well, as they can accurately predict the physical solutions both far outside the training time interval and for unseen initial conditions.
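
To make the data-driven workflow concrete, the sketch below illustrates the general idea for the simplest case of a linear Lagrangian FOM of the form q_tt + K q = 0: build a POD basis from displacement snapshots, project the snapshots onto that subspace, and fit a symmetric reduced stiffness matrix from the projected data so the reduced Euler-Lagrange equations keep the same form. The snapshot data, function names, and the post-hoc symmetrization step here are illustrative assumptions; the paper's actual formulation solves a structure-constrained least-squares problem and also handles mass matrices and nonlinear terms.

```python
import numpy as np

def pod_basis(Q, r):
    """Leading r POD modes of the snapshot matrix Q (n x m)."""
    U, _, _ = np.linalg.svd(Q, full_matrices=False)
    return U[:, :r]

def infer_symmetric_stiffness(Qhat, Qhat_ddot):
    """Fit a reduced stiffness Khat so that Qhat_ddot ~= -Khat @ Qhat,
    then symmetrize it so the ROM keeps the Lagrangian form
    qhat_tt + Khat qhat = 0.  (Simplified stand-in for the paper's
    structure-constrained least-squares formulation.)"""
    KT, *_ = np.linalg.lstsq(Qhat.T, -Qhat_ddot.T, rcond=None)
    K_ls = KT.T
    return 0.5 * (K_ls + K_ls.T)   # project onto symmetric matrices

# Usage with synthetic placeholder snapshots (real data would come from a FOM solver).
rng = np.random.default_rng(0)
n, m, r, dt = 200, 400, 10, 1e-2     # FOM size, #snapshots, ROM size, time step
Q = rng.standard_normal((n, m))      # hypothetical displacement snapshots

V = pod_basis(Q, r)                  # linear subspace learned from data
Qhat = V.T @ Q                       # projected snapshot data
# Reduced accelerations approximated by central finite differences in time.
Qhat_ddot = (Qhat[:, 2:] - 2.0 * Qhat[:, 1:-1] + Qhat[:, :-2]) / dt**2

Khat = infer_symmetric_stiffness(Qhat[:, 1:-1], Qhat_ddot)
print("Khat is symmetric:", np.allclose(Khat, Khat.T))
```

Because the learning step only uses snapshot data and the known form of the governing equations, no access to the FOM operators or simulation code is needed, which is what makes the approach nonintrusive.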

Citations (25)
