Flexibly Enlarged Conjugate Gradient Methods (2305.19013v2)

Published 30 May 2023 in math.NA and cs.NA

Abstract: Enlarged Krylov subspace methods and their s-step versions were introduced [7] with the aim of reducing communication when solving systems of linear equations Ax = b. These enlarged CG methods enlarge the Krylov subspace by a maximum of t vectors per iteration, based on a domain decomposition of the graph of A. In the s-step versions, s iterations of the enlarged Conjugate Gradient methods are merged into one iteration. The enlarged CG methods and their s-step versions converge in fewer iterations than classical CG, but at the expense of requiring more memory storage than CG. Thus, in this paper we explore different options for reducing the memory requirements of these enlarged CG methods without significantly affecting their convergence.
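To make the setting concrete, the following is a minimal sketch (assuming NumPy) of the classical CG baseline the abstract refers to, together with an illustrative residual-splitting step. The `split_residual` helper uses a hypothetical uniform partition of the index set; the paper derives the partition from a domain decomposition of the graph of A, which is not reproduced here.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Classical CG for a symmetric positive definite A: the baseline
    that the enlarged methods aim to improve on in iteration count."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)       # step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # new conjugate direction
        rs = rs_new
    return x

def split_residual(r, t):
    """Illustrative only: split the residual r into t vectors, each supported
    on one block of a uniform partition of the indices. The enlarged CG
    methods replace the single search direction by up to t such directions
    per iteration (the actual partition comes from the graph of A)."""
    n = r.size
    bounds = np.linspace(0, n, t + 1, dtype=int)
    T = np.zeros((n, t))
    for j in range(t):
        T[bounds[j]:bounds[j + 1], j] = r[bounds[j]:bounds[j + 1]]
    return T
```

The columns of `T` sum back to `r`, so the enlarged subspace contains the classical Krylov subspace; storing t vectors per iteration instead of one is precisely the memory overhead the paper seeks to reduce.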
