
Enabling fast convergence of the iterated penalty Picard iteration with $O(1)$ penalty parameter for incompressible Navier-Stokes via Anderson acceleration (2105.09339v1)

Published 19 May 2021 in math.NA and cs.NA

Abstract: This paper considers an enhancement of the classical iterated penalty Picard (IPP) method for the incompressible Navier-Stokes equations, where we restrict our attention to an $O(1)$ penalty parameter and use Anderson acceleration (AA) to significantly improve the convergence properties. After showing that the fixed point operator associated with the IPP iteration is Lipschitz continuous and Lipschitz continuously (Fréchet) differentiable, we apply a recently developed general theory for AA to conclude that IPP enhanced with AA improves its linear convergence rate by the gain factor associated with the underlying AA optimization problem. Results for several challenging numerical tests are given and show that IPP with penalty parameter 1, enhanced with AA, is a very effective solver.
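To make the role of Anderson acceleration concrete, below is a minimal sketch of a generic AA($m$) driver wrapping an arbitrary fixed-point map $x \mapsto g(x)$, such as the IPP update described in the abstract. This is not the paper's finite element IPP solver; the function name, parameters, and the simple scalar test map are illustrative assumptions only.

```python
import numpy as np

def anderson_accelerated_fixed_point(g, x0, m=5, tol=1e-10, max_iter=100):
    """Anderson acceleration AA(m) for the fixed-point iteration x = g(x).

    g   : callable mapping a 1-D numpy array to a 1-D numpy array
    x0  : initial iterate
    m   : algorithmic depth (number of stored residual differences)
    Note: illustrative sketch only, not the paper's IPP/Navier-Stokes solver.
    """
    x = np.asarray(x0, dtype=float)
    gx = g(x)
    f = gx - x                      # fixed-point residual f_k = g(x_k) - x_k
    G_hist = [gx]                   # history of g-evaluations
    F_hist = [f]                    # history of residuals
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        mk = min(m, len(F_hist) - 1)
        if mk == 0:
            x = gx                  # plain Picard step on the first iteration
        else:
            # AA least-squares problem in difference form:
            # minimize || f_k - dF @ gamma ||_2,
            # with dF columns f_k - f_{k-j}, dG columns g(x_k) - g(x_{k-j}).
            dF = np.column_stack([F_hist[-1] - F_hist[-1 - j] for j in range(1, mk + 1)])
            dG = np.column_stack([G_hist[-1] - G_hist[-1 - j] for j in range(1, mk + 1)])
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = gx - dG @ gamma     # accelerated update
        gx = g(x)
        f = gx - x
        G_hist.append(gx)
        F_hist.append(f)
        if len(F_hist) > m + 1:     # keep only the last m+1 entries
            G_hist.pop(0)
            F_hist.pop(0)
    return x

# Example usage: accelerate the scalar contraction x = cos(x).
root = anderson_accelerated_fixed_point(lambda x: np.cos(x), np.array([0.5]), m=3)
```

In the paper's setting, $g$ would be one IPP step (a linear solve with the $O(1)$ penalty term), and the gain factor of the least-squares problem above is what the theory shows scales the linear convergence rate.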

Citations (8)
