
A Neural Network Inspired Formulation of Chemical Kinetics (2008.08483v1)

Published 31 Jul 2020 in physics.chem-ph

Abstract: A method which casts the chemical source term computation into an artificial neural network (ANN)-inspired form is presented. This approach is well-suited for use on emerging supercomputing platforms that rely on graphical processing units (GPUs). The resulting equations allow for a GPU-friendly, matrix-multiplication-based source term estimation in which the leading dimension (batch size) can be interpreted as the number of chemically reacting cells in the domain; as such, the approach can be readily adopted in high-fidelity solvers in which an MPI rank offloads the source term computation for a given number of cells to the GPU. Though the exact ANN-inspired recasting shown here is optimal for GPU environments as-is, this interpretation allows the user to replace portions of the exact routine with trained, so-called approximate ANNs, whose goal is to increase computational efficiency over their exact counterparts. Note that the main objective of this paper is not to use machine learning to develop models, but rather to represent chemical kinetics using the ANN framework. The end result is that little-to-no training is needed, and the GPU-friendly structure of the ANN formulation is preserved during the source term computation. The method is demonstrated using chemical mechanisms of varying complexity on both 0-D auto-ignition and 1-D channel detonation problems, and the details of GPU performance are explored.
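
To make the matrix-multiplication structure concrete, below is a minimal NumPy sketch of how a mass-action source term can be phrased as an ANN-like layer: the log rates of progress are an affine map of the log concentrations (weights given by the reactant stoichiometric coefficients, bias by the log rate constants) followed by an exponential activation, and the net production rates follow from a second matrix multiply against the net stoichiometric matrix. This is an illustration of the general idea only, not the paper's exact routine; the function and variable names (source_terms, k_f, nu_f, nu_net) are placeholders, and reversible reactions, third-body effects, and pressure dependence are omitted.

```python
# Minimal sketch (not the paper's exact routine): batched mass-action source terms
# expressed as matrix multiplications, assuming irreversible elementary reactions
# with precomputed Arrhenius rate constants. All names and shapes are illustrative.
import numpy as np

def source_terms(C, k_f, nu_f, nu_net, eps=1e-300):
    """
    C      : (n_cells, n_species)     molar concentrations (the "batch")
    k_f    : (n_cells, n_reactions)   forward rate constants, already evaluated from T
    nu_f   : (n_species, n_reactions) reactant (forward) stoichiometric coefficients
    nu_net : (n_species, n_reactions) net coefficients, nu'' - nu'
    returns: (n_cells, n_species)     net molar production rates
    """
    # "Hidden layer": log q = log k_f + log(C) @ nu_f, then exponential activation.
    log_q = np.log(k_f) + np.log(C + eps) @ nu_f   # (n_cells, n_reactions)
    q = np.exp(log_q)

    # "Output layer": species production rates via a second matrix multiply.
    return q @ nu_net.T                            # (n_cells, n_species)

# Example: 100 cells, 3 species (A, B, C), 2 reactions: A -> B and B -> C.
rng = np.random.default_rng(0)
conc = rng.uniform(1e-3, 1.0, size=(100, 3))
k_f = rng.uniform(1.0, 10.0, size=(100, 2))
nu_f = np.array([[1.0, 0.0],
                 [0.0, 1.0],
                 [0.0, 0.0]])
nu_net = np.array([[-1.0, 0.0],
                   [1.0, -1.0],
                   [0.0, 1.0]])
omega = source_terms(conc, k_f, nu_f, nu_net)
print(omega.shape)  # (100, 3)
```

The leading dimension of C plays the role of the batch size described in the abstract, i.e. the number of chemically reacting cells an MPI rank offloads to the GPU, so the whole evaluation reduces to dense matrix multiplications that map naturally onto GPU GEMM kernels.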
