
Structure-preserving model reduction of Hamiltonian systems by learning a symplectic autoencoder (2411.13906v1)

Published 21 Nov 2024 in math.NA and cs.NA

Abstract: Evolutionary partial differential equations play a crucial role in many areas of science and engineering. Spatial discretization of these equations leads to a system of ordinary differential equations which can then be solved by numerical time integration. Such a system is often of very high dimension, making the simulation very time consuming. One way to reduce the computational cost is to approximate the large system by a low-dimensional model using a model reduction approach. This master thesis deals with structure-preserving model reduction of Hamiltonian systems by using machine learning techniques. We discuss a nonlinear approach based on the construction of an encoder-decoder pair that minimizes the approximation error and satisfies symplectic constraints to guarantee the preservation of the structure inherent in Hamiltonian systems. More specifically, we study an autoencoder network that learns a symplectic encoder-decoder pair. Symplecticity poses some additional difficulties, as we need to ensure this structure in each network layer. Since these symplectic constraints are described by the (symplectic) Stiefel manifold, we use manifold optimization techniques to ensure the symplecticity of the encoder and decoder. A particular challenge is to adapt the ADAM optimizer to the manifold structure. We present a modified ADAM optimizer that works directly on the Stiefel manifold and compare it to the existing approach based on homogeneous spaces. In addition, we propose several modifications to the network and training setup that significantly improve the performance and accuracy of the autoencoder. Finally, we numerically validate the modified optimizer and different learning configurations on two Hamiltonian systems, the 1D wave equation and the sine-Gordon equation, and demonstrate the improved accuracy and computational efficiency of the presented learning algorithms.
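
The symplecticity constraint mentioned in the abstract can be made concrete with the canonical Poisson matrix J_{2n}. A reduction map A in R^{2N x 2k} lies on the symplectic Stiefel manifold when A^T J_{2N} A = J_{2k}, and its symplectic inverse A^+ = J_{2k}^T A^T J_{2N} is then a left inverse (encoder) for the decoder A. The NumPy sketch below illustrates this constraint for the standard linear case via a block-diagonal "cotangent lift" of an orthonormal basis; it is an illustration of the constraint only, not the thesis's autoencoder or optimizer, and the function names and dimensions are hypothetical choices for this example.

import numpy as np

def poisson_matrix(n):
    """Canonical Poisson matrix J_{2n} = [[0, I_n], [-I_n, 0]]."""
    I, Z = np.eye(n), np.zeros((n, n))
    return np.block([[Z, I], [-I, Z]])

def cotangent_lift(Phi):
    """Block-diagonal lift A = diag(Phi, Phi) of an orthonormal Phi in
    R^{N x k}; A satisfies the symplecticity constraint A^T J_{2N} A = J_{2k}."""
    N, k = Phi.shape
    A = np.zeros((2 * N, 2 * k))
    A[:N, :k] = Phi
    A[N:, k:] = Phi
    return A

def symplectic_inverse(A):
    """Symplectic inverse A^+ = J_{2k}^T A^T J_{2N}, so that A^+ A = I_{2k}."""
    JN = poisson_matrix(A.shape[0] // 2)
    Jk = poisson_matrix(A.shape[1] // 2)
    return Jk.T @ A.T @ JN

# Hypothetical sizes: full state dimension 2N = 200, reduced dimension 2k = 8.
rng = np.random.default_rng(0)
Phi, _ = np.linalg.qr(rng.standard_normal((100, 4)))  # orthonormal basis
A = cotangent_lift(Phi)

print(np.allclose(A.T @ poisson_matrix(100) @ A, poisson_matrix(4)))  # True
print(np.allclose(symplectic_inverse(A) @ A, np.eye(8)))              # True

In the thesis the decoder is nonlinear and each symplectic network layer must respect an analogous constraint, which is why training is posed as an optimization over the (symplectic) Stiefel manifold rather than over unconstrained weights.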
