Time-limited pseudo-optimal H$_2$-model order reduction (1909.10275v4)

Published 23 Sep 2019 in eess.SY and cs.SY

Abstract: A model order reduction algorithm is presented that generates a reduced-order model of the original high-order model which ensures high fidelity within the desired time interval. The reduced model satisfies a subset of the first-order optimality conditions for the time-limited H$_2$-model reduction problem. The algorithm uses a computationally efficient Krylov subspace-based framework to generate the reduced model and is applicable to large-scale systems. The reduced-order model is parameterized to enforce a subset of the first-order optimality conditions in an iteration-free way. We also propose an adaptive framework for the algorithm that ensures a monotonic decay in error irrespective of the choice of interpolation points and tangential directions. The efficacy of the algorithm is validated on benchmark model reduction problems.
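The core mechanism the abstract relies on — projecting a high-order state-space model onto a Krylov subspace built from interpolation points so that the reduced model matches the full transfer function at those points — can be sketched as follows. This is a generic one-sided rational Krylov projection for a SISO system, not the paper's time-limited algorithm; the test system, interpolation points, and function names here are illustrative assumptions.

```python
import numpy as np

def rational_krylov_reduce(A, B, C, sigmas):
    """One-sided rational Krylov projection: build V whose columns span
    span{(sigma_i I - A)^{-1} B}, then project (A, B, C) onto range(V).
    The reduced model then interpolates H(s) at each sigma_i."""
    n = A.shape[0]
    V = np.column_stack([np.linalg.solve(s * np.eye(n) - A, B).ravel()
                         for s in sigmas])
    V, _ = np.linalg.qr(V)  # orthonormal basis for numerical stability
    return V.T @ A @ V, V.T @ B, C @ V

def tf(A, B, C, s):
    """Transfer function H(s) = C (sI - A)^{-1} B for a SISO system."""
    n = A.shape[0]
    return (C @ np.linalg.solve(s * np.eye(n) - A, B)).item()

# Illustrative stable random system (not from the paper's benchmarks)
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
A -= (np.abs(np.linalg.eigvals(A).real).max() + 1) * np.eye(n)  # shift to stability
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

sigmas = [1.0, 2.0, 5.0]           # interpolation points (illustrative)
Ar, Br, Cr = rational_krylov_reduce(A, B, C, sigmas)
```

The reduced triple `(Ar, Br, Cr)` has order equal to the number of interpolation points, and its transfer function agrees with the full model at every `sigma_i` — the interpolation property that iteration-free, pseudo-optimal methods exploit when parameterizing the reduced model.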

