
Numerical Stability of s-step Enlarged Krylov Subspace Conjugate Gradient Methods (1804.10629v1)

Published 27 Apr 2018 in math.NA

Abstract: Recently, enlarged Krylov subspace methods, which consist of enlarging the Krylov subspace by at most t vectors per iteration based on a domain decomposition of the graph of A, were introduced with the aim of reducing communication when solving systems of linear equations Ax=b. In this paper, s-step enlarged Krylov subspace Conjugate Gradient methods are introduced, whereby s iterations of the enlarged Conjugate Gradient methods are merged into one iteration. The numerical stability of these s-step methods is studied, and several numerically stable versions are proposed. Like the enlarged Krylov subspace methods, the s-step enlarged Krylov subspace methods converge faster than Krylov methods in terms of iterations. Moreover, by computing st basis vectors of the enlarged Krylov subspace $\mathscr{K}_{k,t}(A,r_0)$ at the beginning of each s-step iteration, communication is further reduced. It is shown in this paper that the introduced methods are parallelizable with less communication than their corresponding enlarged versions and than Conjugate Gradient.
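The abstract's central construction can be sketched in code: split the initial residual r_0 into t vectors according to a partition of the unknowns (standing in for the domain decomposition of the graph of A), then compute s·t basis vectors of the enlarged Krylov subspace up front by repeated multiplication with A, followed by an orthonormalization for numerical stability. This is a minimal illustrative sketch, not the paper's algorithm; the function name, the contiguous partition, and the plain QR orthonormalization are all assumptions made for the example.

```python
import numpy as np

def enlarged_krylov_basis(A, r0, t, s):
    """Illustrative sketch: compute s*t basis vectors of an enlarged
    Krylov subspace. The splitting operator T(r0) and the use of QR
    are simplifying assumptions, not the paper's exact method."""
    n = r0.shape[0]
    # T(r0): split r0 into t vectors supported on disjoint index blocks
    # (a contiguous partition stands in for the domain decomposition).
    T = np.zeros((n, t))
    bounds = np.linspace(0, n, t + 1, dtype=int)
    for j in range(t):
        lo, hi = bounds[j], bounds[j + 1]
        T[lo:hi, j] = r0[lo:hi]
    # Block Krylov sequence [T, A T, ..., A^{s-1} T]: s*t vectors
    # computed at the start of the s-step iteration.
    blocks = [T]
    for _ in range(s - 1):
        blocks.append(A @ blocks[-1])
    V = np.hstack(blocks)
    # Orthonormalize the s*t vectors for numerical stability.
    Q, _ = np.linalg.qr(V)
    return Q
```

Computing all s·t vectors in one batch is what enables the communication reduction the abstract mentions: the matrix powers and the orthonormalization can each be organized as a single communication phase instead of one per iteration.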
