Mathematics > Numerical Analysis
[Submitted on 21 Nov 2024]
Title: Structure-preserving model reduction of Hamiltonian systems by learning a symplectic autoencoder
Abstract: Evolutionary partial differential equations play a crucial role in many areas of science and engineering. Spatial discretization of these equations leads to a system of ordinary differential equations, which can then be solved by numerical time integration. Such a system is often of very high dimension, making simulation very time-consuming. One way to reduce the computational cost is to approximate the large system by a low-dimensional model using a model reduction approach. This master's thesis deals with structure-preserving model reduction of Hamiltonian systems using machine learning techniques. We discuss a nonlinear approach based on the construction of an encoder-decoder pair that minimizes the approximation error and satisfies symplectic constraints, guaranteeing the preservation of the structure inherent in Hamiltonian systems. More specifically, we study an autoencoder network that learns a symplectic encoder-decoder pair. Symplecticity poses additional difficulties, as this structure must be ensured in each network layer. Since these symplectic constraints are described by the (symplectic) Stiefel manifold, we use manifold optimization techniques to ensure the symplecticity of the encoder and decoder. A particular challenge is to adapt the ADAM optimizer to the manifold structure. We present a modified ADAM optimizer that works directly on the Stiefel manifold and compare it to the existing approach based on homogeneous spaces. In addition, we propose several modifications to the network and training setup that significantly improve the performance and accuracy of the autoencoder. Finally, we numerically validate the modified optimizer and the different learning configurations on two Hamiltonian systems, the 1D wave equation and the sine-Gordon equation, and demonstrate the improved accuracy and computational efficiency of the presented learning algorithms.
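The symplectic constraint mentioned in the abstract can be made concrete. A weight matrix A in R^{2N x 2n} lies on the symplectic Stiefel manifold if A^T J_{2N} A = J_{2n}, where J_{2k} denotes the canonical Poisson matrix; the symplectic inverse A^+ = J_{2n}^T A^T J_{2N} is then a left inverse of A and can serve as the matching linear encoder. The following is a minimal NumPy sketch of this constraint; the dimensions N, n and the block-diagonal construction of A are illustrative assumptions, not the architecture used in the thesis.

import numpy as np

def poisson_matrix(k):
    # Canonical Poisson matrix J_{2k} = [[0, I_k], [-I_k, 0]].
    I, Z = np.eye(k), np.zeros((k, k))
    return np.block([[Z, I], [-I, Z]])

# Illustrative sizes: full dimension 2N, reduced dimension 2n.
N, n = 50, 4
J_big, J_small = poisson_matrix(N), poisson_matrix(n)

# One simple way to build a symplectic decoder matrix: embed a matrix E
# with orthonormal columns (here from a QR factorization) block-diagonally.
E, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((N, n)))
A = np.block([[E, np.zeros((N, n))],
              [np.zeros((N, n)), E]])

# Defining constraint of the symplectic Stiefel manifold: A^T J_{2N} A = J_{2n}.
assert np.allclose(A.T @ J_big @ A, J_small)

# The symplectic inverse A^+ = J_{2n}^T A^T J_{2N} is a left inverse of A,
# which is what lets A^+ play the role of the matching (linear) encoder.
A_plus = J_small.T @ A.T @ J_big
assert np.allclose(A_plus @ A, np.eye(2 * n))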
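The abstract also mentions adapting the ADAM optimizer to the manifold structure. As a rough illustration of the general pattern (project the Euclidean gradient onto the tangent space, update the moment estimates, retract back onto the manifold), here is a hedged sketch using the ordinary Stiefel manifold {X : X^T X = I} with a QR retraction. This is a simplification for illustration only: the thesis works on the symplectic Stiefel manifold, whose projection and retraction differ, and keeping the ADAM moments elementwise without transporting them between tangent spaces is a further simplification.

import numpy as np

def stiefel_project(X, G):
    # Project a Euclidean gradient G onto the tangent space of the
    # Stiefel manifold {X : X^T X = I} at the point X.
    sym = (X.T @ G + G.T @ X) / 2
    return G - X @ sym

def stiefel_retract(Y):
    # QR-based retraction: map a perturbed point back onto the manifold
    # (assumes the diagonal of R is nonzero, which holds generically).
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.diag(R))

def manifold_adam_step(X, G, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # One ADAM-like step constrained to the manifold: project the gradient,
    # update the (elementwise) moment estimates, take the step, retract.
    R = stiefel_project(X, G)
    m = b1 * m + (1 - b1) * R
    v = b2 * v + (1 - b2) * R**2
    m_hat = m / (1 - b1**t)      # bias-corrected first moment
    v_hat = v / (1 - b2**t)      # bias-corrected second moment
    X_new = stiefel_retract(X - lr * m_hat / (np.sqrt(v_hat) + eps))
    return X_new, m, v

# Toy usage: minimize ||X - T||_F^2 over the Stiefel manifold.
rng = np.random.default_rng(0)
T = np.linalg.qr(rng.standard_normal((20, 5)))[0]
X = np.linalg.qr(rng.standard_normal((20, 5)))[0]
m, v = np.zeros_like(X), np.zeros_like(X)
for t in range(1, 201):
    G = 2 * (X - T)              # Euclidean gradient of the toy loss
    X, m, v = manifold_adam_step(X, G, m, v, t, lr=1e-2)
assert np.allclose(X.T @ X, np.eye(5), atol=1e-8)

The projection and QR retraction above are specific to the orthogonal Stiefel manifold; on the symplectic Stiefel manifold the tangent-space projection and retraction (e.g. via a Cayley transform) take different forms, and a full Riemannian ADAM would also transport the moment estimates between tangent spaces.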
Submission history
From: Florian Konrad Josef Niggl
[v1] Thu, 21 Nov 2024 07:37:07 UTC (9,222 KB)