
Optimal Polynomial Smoothers for Parallel AMG (2407.09848v3)

Published 13 Jul 2024 in math.NA and cs.NA

Abstract: In this paper, we explore polynomial accelerators that are well suited for parallel computation, specifically as smoothers in Algebraic MultiGrid (AMG) preconditioners. These accelerators address a minimax problem, initially formulated in [Lottes, Numer. Lin. Alg. with Appl. 30(6), 2518 (2023)], aiming to achieve an optimal (or near-optimal) bound for a polynomial-dependent constant that enters the AMG V-cycle error bound, without requiring information about the matrices' spectra. Lottes focuses on Chebyshev polynomials of the fourth kind and defines the relevant recurrence formulas, which apply to any convergent basic smoother. Here, we demonstrate the efficacy of these accelerations for large-scale applications on modern GPU-accelerated supercomputers. Furthermore, we formulate a variant of the aforementioned minimax problem that naturally leads to solutions relying on Chebyshev polynomials of the first kind as accelerators for a basic smoother. For all of the polynomial accelerations, we describe efficient GPU kernels for their application and demonstrate their comparable effectiveness on standard benchmarks at very large scales.
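
The key building block is the fourth-kind Chebyshev acceleration of a convergent basic smoother, which needs only an upper bound on the spectrum of the preconditioned operator rather than both ends of it. The snippet below is a minimal NumPy sketch of that idea, assuming a damped-Jacobi basic smoother; the coefficients follow our reading of the recurrence reported in Lottes (2023), and the function name, the toy Poisson problem, and the choice of the bound `rho` are illustrative assumptions, not the GPU kernels described in the paper.

```python
# Illustrative sketch only: fourth-kind Chebyshev acceleration of a Jacobi-type
# basic smoother, following (our reading of) the recurrence in Lottes (2023).
# `rho` is an assumed upper bound on the spectral radius of D^{-1} A.
import numpy as np
import scipy.sparse as sp

def cheb4_smoother(A, b, x, m, rho):
    """Apply m steps of fourth-kind-Chebyshev-accelerated Jacobi to A x = b."""
    Dinv = 1.0 / A.diagonal()            # basic smoother: Jacobi (diagonal scaling)
    r = b - A @ x                        # initial residual
    d = (4.0 / 3.0) * (Dinv * r) / rho   # first correction
    for k in range(1, m):
        x = x + d
        r = r - A @ d
        d = ((2 * k - 1) / (2 * k + 3)) * d \
            + ((8 * k + 4) / (2 * k + 3)) * (Dinv * r) / rho
    return x + d                         # final update

if __name__ == "__main__":
    # Toy 1-D Poisson problem; the spectrum of D^{-1} A lies in (0, 2), so rho = 2.
    n = 64
    A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)
    x = cheb4_smoother(A, b, np.zeros(n), m=4, rho=2.0)
    print(np.linalg.norm(b - A @ x))     # smoothed residual norm
```

In an AMG V-cycle this routine would play the role of the pre- and post-smoother; the paper's contribution is the analysis of such accelerations and their efficient GPU implementation at very large scale, not this serial sketch.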
