Nonlinear p-multigrid preconditioner for implicit time integration of compressible Navier--Stokes equations (2202.09733v1)

Published 20 Feb 2022 in math.NA, cs.NA, and physics.flu-dyn

Abstract: Within the framework of $p$-adaptive flux reconstruction, we aim to construct efficient polynomial multigrid ($p$MG) preconditioners for implicit time integration of the Navier--Stokes equations using Jacobian-free Newton--Krylov (JFNK) methods. We hypothesise that in pseudo transient continuation (PTC), as the residual drops, the frequency of the error modes that dictate the convergence rate gets higher and higher. We apply nonlinear $p$MG solvers to stiff steady problems at low Mach number ($\mathrm{Ma}=10^{-3}$) to verify our hypothesis. It is demonstrated that once the residual drops by a few orders of magnitude, improved smoothing on intermediate $p$-sublevels will not only maintain the stability of $p$MG at large time steps but also improve the convergence rate. For the unsteady Navier--Stokes equations, we elaborate how to construct nonlinear preconditioners using pseudo transient continuation for the matrix-free generalized minimal residual (GMRES) method used in explicit first stage, singly diagonally implicit Runge--Kutta (ESDIRK) methods, and linearly implicit Rosenbrock--Wanner (ROW) methods. Given that at each time step the initial guess in the nonlinear solver is not distant from the converged solution, we recommend a two-level $p_{p_0\text{-}p_0/2}$ or even $p_{p_0\text{-}(p_0-1)}$ $p$-hierarchy for optimal efficiency with a matrix-based smoother on the coarser level based on our hypothesis. It is demonstrated that insufficient smoothing on intermediate $p$-sublevels will deteriorate the performance of the $p$MG preconditioner greatly. (See full abstract in the paper.)
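To make the overall structure concrete, the sketch below shows a matrix-free (Jacobian-free) GMRES step whose preconditioner applies a two-level coarse correction, loosely mimicking the recommended $p_{p_0\text{-}p_0/2}$ hierarchy with a matrix-based smoother on the coarse level. It is not the authors' flux-reconstruction implementation: the toy residual, the injection-based restriction/prolongation, the damped pseudo-time-like smoother, and all parameter values are illustrative placeholders chosen only so the example runs.

```python
# Illustrative sketch (not the paper's code): one JFNK step with a
# two-level "p-multigrid"-style preconditioner for matrix-free GMRES.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

n = 64  # fine-level ("p0") degrees of freedom (toy size, assumption)
A = (np.diag(2.0 * np.ones(n))
     + np.diag(-1.0 * np.ones(n - 1), 1)
     + np.diag(-1.0 * np.ones(n - 1), -1))

def residual(u):
    """Toy nonlinear residual R(u); a stand-in for the discrete NS residual."""
    return A @ u + 0.1 * u**3 - 1.0

def jfnk_matvec(u, v, eps=1e-7):
    """Jacobian-free product J(u) v ~ (R(u + eps*v) - R(u)) / eps."""
    return (residual(u + eps * v) - residual(u)) / eps

# Two-level transfer operators (placeholders for p-level restriction/prolongation).
nc = n // 2
R = np.zeros((nc, n))
R[np.arange(nc), 2 * np.arange(nc)] = 1.0   # simple injection restriction
P = R.T                                     # prolongation by transpose

def pmg_preconditioner(u, r, nu=3, dtau=0.5):
    """One two-level cycle: damped (pseudo-time-like) sweeps with a small
    matrix-based coarse operator, then prolongation of the correction."""
    rc = R @ r
    # Assemble the coarse operator column by column via JFNK products.
    Jc = R @ np.array([jfnk_matvec(u, P @ e) for e in np.eye(nc)]).T
    ec = np.zeros(nc)
    for _ in range(nu):
        ec += dtau * (rc - Jc @ ec)
    return P @ ec

# One Newton step of R(u) = 0 with preconditioned matrix-free GMRES.
u = np.zeros(n)
r = -residual(u)
J_op = LinearOperator((n, n), matvec=lambda v: jfnk_matvec(u, v))
M_op = LinearOperator((n, n), matvec=lambda v: pmg_preconditioner(u, v))
du, info = gmres(J_op, r, M=M_op)
u += du
print("GMRES flag:", info, "|R(u)| =", np.linalg.norm(residual(u)))
```

In the paper's setting, the fine and coarse levels correspond to different polynomial orders of the flux-reconstruction discretization rather than to the node subsampling used here, and the coarse-level smoothing is driven by pseudo transient continuation; the sketch only conveys how such a cycle plugs into matrix-free GMRES as a right-hand preconditioner.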

Citations (1)
