IMEX-RB: a self-adaptive IMEX time integration scheme exploiting the RB method

Published 19 Jun 2025 in math.NA and cs.NA (arXiv:2506.16470v1)

Abstract: In this work, we introduce a self-adaptive implicit-explicit (IMEX) time integration scheme, named IMEX-RB, for the numerical integration of systems of ordinary differential equations (ODEs) arising from spatial discretizations of partial differential equations (PDEs) by finite difference methods. Leveraging the Reduced Basis (RB) method, at each timestep we project the high-fidelity problem onto a suitable low-dimensional subspace and integrate its dynamics implicitly. Following the IMEX paradigm, the resulting solution then serves as an educated guess within a full-order explicit step. Notably, compared to the canonical RB method, IMEX-RB neither requires a parametrization of the underlying PDE nor features an offline-online splitting, since the reduced subspace is built dynamically from the high-fidelity solution history. We present the first-order formulation of IMEX-RB and establish its convergence and stability properties. In particular, under appropriate conditions on the method's hyperparameters, IMEX-RB is unconditionally stable. The theoretical analysis is corroborated by numerical experiments on representative model problems in two and three dimensions. The results demonstrate that our approach can outperform conventional time integration schemes such as backward Euler. Indeed, IMEX-RB yields high-fidelity accurate solutions, provided that its main hyperparameters (namely, the reduced basis size and the stability tolerance) are suitably tuned. Moreover, IMEX-RB realizes computational gains over backward Euler for a range of timestep sizes above the forward Euler stability threshold.
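The abstract outlines the per-step structure of the scheme: build a reduced subspace from the solution history, take an implicit step on the projected dynamics, then use the lifted reduced solution as an educated guess inside a full-order explicit step. Below is a minimal illustrative sketch in Python/NumPy for a linear autonomous system du/dt = Au. It is a guess at the first-order formulation reconstructed from the abstract alone; the function name, the QR-based basis construction, and the fixed-length snapshot history are illustrative assumptions, not the paper's actual algorithm (which also includes a stability tolerance and self-adaptation not modeled here).

```python
import numpy as np

def imex_rb_step(A, u, snapshots, dt):
    """One sketched first-order IMEX-RB step for du/dt = A u.

    Illustrative reconstruction: (1) orthonormal reduced basis V from
    recent solution snapshots, (2) backward Euler on the reduced
    dynamics, (3) the lifted reduced solution serves as the educated
    guess inside a full-order forward Euler step.
    """
    # Reduced basis from the solution history via (thin) QR.
    V, _ = np.linalg.qr(np.column_stack(snapshots))
    n = V.shape[1]
    # Implicit step on the reduced dynamics: (I - dt * V^T A V) y = V^T u.
    Ar = V.T @ A @ V
    y = np.linalg.solve(np.eye(n) - dt * Ar, V.T @ u)
    u_guess = V @ y  # lifted reduced implicit solution
    # Full-order explicit step evaluated at the educated guess.
    return u + dt * (A @ u_guess)

# Toy stiff problem: 1D heat equation, centered finite differences.
N = 50
h = 1.0 / (N + 1)
A = (np.diag(-2.0 * np.ones(N))
     + np.diag(np.ones(N - 1), 1)
     + np.diag(np.ones(N - 1), -1)) / h**2
u = np.sin(np.pi * h * np.arange(1, N + 1))  # smooth initial condition

dt = 4.0 * h**2  # above the forward Euler stability limit (h^2 / 2)
snapshots = [u.copy()]
for _ in range(100):
    u = imex_rb_step(A, u, snapshots, dt)
    snapshots = (snapshots + [u.copy()])[-5:]  # keep a short history
```

With this timestep, plain forward Euler would be unstable, while the sketched step remains bounded because the stiff dynamics captured by the snapshot subspace are treated implicitly; components orthogonal to the subspace receive no update in the guess. This is only meant to convey the IMEX-RB idea, not to reproduce the paper's stability guarantees.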
