Offsite Autotuning Approach -- Performance Model Driven Autotuning Applied to Parallel Explicit ODE Methods (2004.03695v1)

Published 7 Apr 2020 in cs.PF

Abstract: Autotuning techniques are a promising approach to minimize the otherwise tedious manual effort of optimizing scientific applications for a specific target platform. Ideally, an autotuning approach is capable of reliably identifying the most efficient implementation variant(s) for a new target system or new characteristics of the input by applying suitable program transformations and analytic models. In this work, we introduce Offsite, an offline autotuning approach which automates this selection process at installation time by rating implementation variants based on an analytic performance model without requiring time-consuming runtime experiments. From abstract multilevel YAML description languages, Offsite automatically derives optimized, platform-specific and problem-specific code of possible implementation variants and applies the performance model to these implementation variants. We apply Offsite to parallel numerical methods for ordinary differential equations (ODEs). In particular, we investigate tuning a specific class of explicit ODE solvers (PIRK methods) for various initial value problems (IVPs) on shared-memory systems. Our experiments demonstrate that Offsite is able to reliably identify a set of the most efficient implementation variants for given test configurations (ODE solver, IVP, platform) and is capable of effectively handling important autotuning scenarios.
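The core idea of the abstract can be illustrated with a small, hedged sketch: an offline autotuner rates candidate implementation variants with an analytic performance model and ranks them without any runtime experiments. All names, numbers, and the simple roofline-style cost model below are illustrative assumptions, not the actual Offsite model or API.

```python
# Illustrative sketch only: ranking implementation variants offline by an
# analytic performance model, in the spirit of (but not identical to) Offsite.
from dataclasses import dataclass

@dataclass
class Variant:
    name: str           # hypothetical variant, e.g. a loop-fusion choice in a PIRK step
    flops: float        # floating-point operations per ODE step (assumed known analytically)
    bytes_moved: float  # main-memory traffic per ODE step (assumed known analytically)

def predicted_time(v: Variant, peak_flops: float, bandwidth: float) -> float:
    # Simple roofline-style model: the step is bound by whichever of compute
    # or memory traffic takes longer on the target platform.
    return max(v.flops / peak_flops, v.bytes_moved / bandwidth)

def rank_variants(variants, peak_flops, bandwidth):
    # Rate every variant at installation time and return them best-first,
    # with no runtime experiments needed.
    return sorted(variants, key=lambda v: predicted_time(v, peak_flops, bandwidth))

# Hypothetical platform characteristics and two candidate variants.
variants = [
    Variant("fused-loops", flops=2.0e9, bytes_moved=1.6e10),
    Variant("split-loops", flops=2.0e9, bytes_moved=4.0e10),
]
ranked = rank_variants(variants, peak_flops=5.0e11, bandwidth=2.0e10)
print(ranked[0].name)  # variant the model predicts to be fastest
```

In the paper, the per-variant inputs to such a model are derived automatically from the multilevel YAML descriptions of the ODE method, the IVP, and the target platform; the toy cost function above merely stands in for the paper's analytic performance model.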
