Autotuning by Changing Directives and Number of Threads in OpenMP using ppOpen-AT (2312.05779v1)

Published 10 Dec 2023 in cs.PF and cs.DC

Abstract: Computer architectures have diversified in recent years. To achieve high performance in numerical software, the code must be tuned to the target architecture, but such per-environment optimization is difficult for anyone who is not a specialist in computer architecture. Autotuning (AT) reduces this tuning effort and makes architecture-specific optimized implementations usable by non-experts. In this research, we propose an AT technique for programs that use Open Multi-Processing (OpenMP): an AT language is used to switch between optimized variants of OpenMP loops and to change the number of OpenMP threads dynamically for each computational kernel. Performance was evaluated on the Fujitsu PRIMEHPC FX100, a K-computer-class supercomputer installed at the Information Technology Center, Nagoya University. The tuned code achieved a speedup of 1.801 times over the original code in a plasma turbulence analysis.
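
The abstract describes two AT knobs: selecting among alternative OpenMP directive variants for an optimized loop, and choosing the number of threads per computational kernel. Below is a minimal C sketch of that search as a plain exhaustive timing loop, written under assumptions: the kernel, its two variants, and the thread-count list are illustrative, and this is not ppOpen-AT's actual directive syntax or generated code.

```c
/* Illustrative sketch of the autotuning idea from the abstract:
 * time each (loop variant, thread count) pair for a kernel and
 * keep the fastest combination. Not ppOpen-AT's generated code. */
#include <stdio.h>
#include <omp.h>

#define N 1000000
static double a[N], b[N];

/* Candidate 1: parallel loop with static scheduling. */
static void kernel_v1(void) {
    #pragma omp parallel for schedule(static)
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * b[i] + 1.0;
}

/* Candidate 2: same loop with dynamic scheduling. */
static void kernel_v2(void) {
    #pragma omp parallel for schedule(dynamic, 1024)
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * b[i] + 1.0;
}

int main(void) {
    void (*variants[])(void) = { kernel_v1, kernel_v2 };
    int thread_counts[] = { 1, 2, 4, 8 };
    double best = 1e30;
    int best_v = 0, best_t = 0;

    for (int i = 0; i < N; i++) b[i] = (double)i;

    /* Exhaustive search over (variant, thread count), as an AT
     * framework would perform at install or run time. */
    for (int v = 0; v < 2; v++) {
        for (int t = 0; t < 4; t++) {
            omp_set_num_threads(thread_counts[t]);
            double t0 = omp_get_wtime();
            variants[v]();
            double dt = omp_get_wtime() - t0;
            if (dt < best) { best = dt; best_v = v; best_t = thread_counts[t]; }
        }
    }
    printf("best: variant %d with %d threads (%.6f s)\n",
           best_v + 1, best_t, best);
    return 0;
}
```

Compile with an OpenMP-enabled compiler, e.g. `gcc -O2 -fopenmp`. In ppOpen-AT this search is driven by annotations in the source rather than hand-written driver code, but the selection principle is the same.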
