
Deep Coregionalization for the Emulation of Spatial-Temporal Fields (1910.07577v1)

Published 16 Oct 2019 in physics.comp-ph and cs.CE

Abstract: Data-driven surrogate models are widely used for applications such as design optimization and uncertainty quantification, where repeated evaluations of an expensive simulator are required. For most partial differential equation (PDE) simulators, the outputs of interest are often spatial or spatial-temporal fields, leading to very high-dimensional outputs. Despite the success of existing data-driven surrogates for high-dimensional outputs, most methods require a significant number of samples to cover the response surface in order to achieve a reasonable degree of accuracy. This demand makes the idea of surrogate models less attractive considering the high computational cost to generate the data. To address this issue, we exploit the multi-fidelity nature of a PDE simulator and introduce deep coregionalization, a Bayesian non-parametric autoregressive framework for efficient emulation of spatial-temporal fields. To effectively extract the output correlations in the context of multi-fidelity data, we develop a novel dimension reduction technique, residual principal component analysis. Our model can simultaneously capture the rich output correlations and the fidelity correlations and make high-fidelity predictions with only a few expensive, high-fidelity simulation samples. We show the advantages of our model in three canonical PDE models and a fluid dynamics problem. The results show that the proposed method can not only approximate a simulator with significantly less cost (at about 10%-25%) but also further improve model accuracy.
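The residual principal component analysis mentioned in the abstract can be illustrated with a minimal sketch: extract a PCA basis from low-fidelity field snapshots, project the high-fidelity snapshots onto that basis, and then apply PCA again to the leftover residual. The function names, basis sizes, and two-level setup below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pca_basis(Y, k):
    # Center the snapshots and take the top-k right singular
    # vectors as an orthonormal basis for the field dimension.
    Yc = Y - Y.mean(axis=0)
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    return Vt[:k]  # shape (k, d)

def residual_pca(Y_lf, Y_hf, k_lf=5, k_res=3):
    """Two-level sketch of residual PCA (illustrative, not the
    paper's code): a low-fidelity basis plus a basis for the
    high-fidelity residual left after projecting onto it."""
    B_lf = pca_basis(Y_lf, k_lf)
    # Project centered high-fidelity snapshots onto the
    # low-fidelity basis; the residual is what that basis misses.
    Y_hf_c = Y_hf - Y_hf.mean(axis=0)
    residual = Y_hf_c - Y_hf_c @ B_lf.T @ B_lf
    B_res = pca_basis(residual, k_res)
    return B_lf, B_res
```

By construction the residual basis is orthogonal to the low-fidelity one, so the two bases capture complementary output correlations across fidelity levels.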

Authors (3)
  1. Wei Xing (34 papers)
  2. Robert M. Kirby (149 papers)
  3. Shandian Zhe (58 papers)
Citations (3)