Nonlinear Model Order Reduction via Lifting Transformations and Proper Orthogonal Decomposition

Published 6 Aug 2018 in cs.NA, cs.SY, and math.DS (arXiv:1808.02086v2)

Abstract: This paper presents a structure-exploiting nonlinear model reduction method for systems with general nonlinearities. First, the nonlinear model is lifted to a model with more structure via variable transformations and the introduction of auxiliary variables. The lifted model is equivalent to the original model---it uses a change of variables, but introduces no approximations. When discretized, the lifted model yields a polynomial system of either ordinary differential equations or differential algebraic equations, depending on the problem and lifting transformation. Proper orthogonal decomposition (POD) is applied to the lifted models, yielding a reduced-order model for which all reduced-order operators can be pre-computed. Thus, a key benefit of the approach is that there is no need for additional approximations of nonlinear terms, in contrast with existing nonlinear model reduction methods requiring sparse sampling or hyper-reduction. Application of the lifting and POD model reduction to the FitzHugh-Nagumo benchmark problem and to a tubular reactor model with Arrhenius reaction terms shows that the approach is competitive in terms of reduced model accuracy with state-of-the-art model reduction via POD and discrete empirical interpolation, while having the added benefits of opening new pathways for rigorous analysis and input-independent model reduction via the introduction of the lifted problem structure.
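To make the lifting idea concrete, the following is a minimal, self-contained sketch (not the authors' code, and not either of the paper's benchmark problems). It uses an assumed toy system dx_i/dt = -exp(x_i) and the illustrative lifting variable w_i = exp(x_i), which turns the model into the quadratic system dx_i/dt = -w_i, dw_i/dt = -w_i^2, i.e. dz/dt = A z + H (z kron z) for the lifted state z = [x; w]. POD is then applied to snapshots of the lifted state, and all reduced operators are precomputed, so the online reduced model never evaluates the original exponential nonlinearity. All dimensions and tolerances below are illustrative choices, not values from the paper.

```python
# Sketch of lifting + POD on an assumed toy problem dx_i/dt = -exp(x_i).
# Lifting with w_i = exp(x_i) gives the quadratic system
#     dx_i/dt = -w_i,    dw_i/dt = w_i * dx_i/dt = -w_i**2.
import numpy as np
from scipy.integrate import solve_ivp

n = 50                                   # toy problem size (assumed)
rng = np.random.default_rng(0)
x0 = rng.uniform(-1.0, 0.0, n)
N = 2 * n                                # lifted state dimension

# Lifted full-order operators: dz/dt = A z + H (z kron z)
A = np.zeros((N, N))
A[:n, n:] = -np.eye(n)                   # dx/dt = -w
H = np.zeros((N, N * N))
for i in range(n):
    k = n + i
    H[k, k * N + k] = -1.0               # dw_i/dt = -w_i^2

def fom_rhs(t, z):
    return A @ z + H @ np.kron(z, z)

z0 = np.concatenate([x0, np.exp(x0)])
t_eval = np.linspace(0.0, 2.0, 200)
fom = solve_ivp(fom_rhs, (0.0, 2.0), z0, t_eval=t_eval, rtol=1e-8)

# POD basis from snapshots of the lifted state
U, _, _ = np.linalg.svd(fom.y, full_matrices=False)
r = 6                                    # reduced dimension (illustrative choice)
V = U[:, :r]

# Reduced operators are precomputed offline; the online ROM is purely quadratic
A_r = V.T @ A @ V
H_r = V.T @ H @ np.kron(V, V)

def rom_rhs(t, zr):
    return A_r @ zr + H_r @ np.kron(zr, zr)

rom = solve_ivp(rom_rhs, (0.0, 2.0), V.T @ z0, t_eval=t_eval, rtol=1e-8)
err = np.linalg.norm(fom.y - V @ rom.y) / np.linalg.norm(fom.y)
print(f"relative error of the lifted POD ROM: {err:.2e}")
```

The design point the sketch illustrates is the one highlighted in the abstract: because the lifted model is polynomial, the projected operators V^T A V and V^T H (V kron V) can be assembled once offline, so no sparse sampling or DEIM-style hyper-reduction of the nonlinear term is needed during online simulation.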

Citations (109)

Authors (2)