
Towards an Efficient Shifted Cholesky QR for Applications in Model Order Reduction using pyMOR (2507.07788v1)

Published 10 Jul 2025 in math.NA and cs.NA

Abstract: Many model order reduction (MOR) methods rely on the computation of an orthonormal basis of a subspace onto which the large full order model is projected. Numerically, this entails the orthogonalization of a set of vectors. The nature of the MOR process imposes several requirements for the orthogonalization process. Firstly, MOR is oftentimes performed in an adaptive or iterative manner, where the quality of the reduced order model, i.e., the dimension of the reduced subspace, is decided on the fly. Therefore, it is important that the orthogonalization routine can be executed iteratively. Secondly, one possibly has to deal with high-dimensional arrays of abstract vectors that do not allow explicit access to entries, making it difficult to employ so-called 'orthogonal triangularization' algorithms such as Householder QR. For these reasons, (modified) Gram-Schmidt-type algorithms are commonly used in MOR applications. These methods belong to the category of 'triangular orthogonalization' algorithms that do not rely on elementwise access to the vectors and can be easily updated. Recently, algorithms like shifted Cholesky QR have gained attention. These also belong to the aforementioned category and have proven their aptitude for MOR algorithms in previous studies. A key benefit of these methods is that they are communication-avoiding, leading to vastly superior performance on memory-bandwidth-limited problems and parallel or distributed architectures. This work formulates an efficient updating scheme for Cholesky QR algorithms and proposes an improved shifting strategy for highly ill-conditioned matrices. The proposed algorithmic extensions are validated with numerical experiments on a laptop and computation server.
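To make the abstract's central idea concrete, the following is a minimal NumPy sketch of a shifted Cholesky QR iteration. It is not the paper's proposed updating scheme or shifting strategy; the shift used here is a simplified variant from the earlier shifted CholeskyQR literature, and the function name is illustrative. The key property the abstract highlights is visible in the code: each sweep touches the vectors only through a single Gram-matrix product and a triangular solve, which is what makes the method communication-avoiding and usable on abstract vector arrays.

```python
import numpy as np

def shifted_cholesky_qr(A, sweeps=2):
    """Orthonormalize the columns of A via shifted Cholesky QR.

    Each sweep forms the Gram matrix G = Q^T Q (one global reduction),
    adds a small diagonal shift so the Cholesky factorization cannot
    break down for ill-conditioned input, and updates Q = Q R^{-1}.
    Two sweeps (a shifted CholeskyQR2 variant) typically restore
    orthogonality to machine precision for moderately conditioned A.
    """
    m, n = A.shape
    u = np.finfo(A.dtype).eps
    Q = A.copy()
    for _ in range(sweeps):
        G = Q.T @ Q  # single Gram-matrix reduction over the tall matrix
        # Shift proportional to machine precision and ||Q||_2^2
        # (assumption: simplified form of the standard shift; the paper
        # proposes an improved strategy for highly ill-conditioned cases).
        s = 11.0 * (m * n + n * (n + 1)) * u * np.linalg.norm(G, 2)
        L = np.linalg.cholesky(G + s * np.eye(n))  # G + sI = L L^T
        R = L.T                                    # upper-triangular factor
        Q = np.linalg.solve(R.T, Q.T).T            # Q <- Q R^{-1}
    return Q
```

Because the vectors enter only through `Q.T @ Q` and a right-hand triangular solve, the same structure can be expressed with pyMOR-style abstract vector arrays that expose inner products but not elementwise access.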
