
Multi-fidelity modeling with different input domain definitions using Deep Gaussian Processes (2006.15924v1)

Published 29 Jun 2020 in cs.LG and stat.ML

Abstract: Multi-fidelity approaches combine different models built on a scarce but accurate data-set (high-fidelity data-set) and a large but approximate one (low-fidelity data-set) in order to improve prediction accuracy. Gaussian Processes (GPs) are one of the popular approaches for modeling the correlations between these different fidelity levels. Deep Gaussian Processes (DGPs), which are functional compositions of GPs, have also been adapted to multi-fidelity through the Multi-Fidelity Deep Gaussian Process model (MF-DGP). This model increases the expressive power compared to GPs by considering non-linear correlations between fidelities within a Bayesian framework. However, these multi-fidelity methods consider only the case where the inputs of the different fidelity models are defined over the same domain of definition (e.g., same variables, same dimensions). Due to simplifications in the modeling of the low fidelity, some variables may be omitted or a different parametrization may be used compared to the high-fidelity model. In this paper, Deep Gaussian Processes for multi-fidelity (MF-DGP) are extended to the case where a different parametrization is used for each fidelity. The performance of the proposed multi-fidelity modeling technique is assessed on analytical test cases and on structural and aerodynamic real physical problems.
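To make the multi-fidelity setting concrete, the sketch below shows a simple two-fidelity GP baseline (not the paper's MF-DGP) in which the low-fidelity model is defined on a different, smaller input domain than the high-fidelity model, and its prediction is fed to the high-fidelity GP as an extra feature. The simulators `f_lf` and `f_hf`, the domain mapping `map_to_lf`, and the use of scikit-learn GPs are illustrative assumptions, not the paper's method or code.

```python
# Minimal sketch (not the paper's MF-DGP): a hierarchical two-fidelity GP where
# the low-fidelity model lives on a 1-D input domain and the high-fidelity model
# on a 2-D one. All function names and the domain mapping are assumptions made
# for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Hypothetical simulators: the low-fidelity model ignores the second variable.
def f_lf(x1):                  # low fidelity, 1-D input domain
    return np.sin(8.0 * x1)

def f_hf(x1, x2):              # high fidelity, 2-D input domain
    return np.sin(8.0 * x1) + 0.3 * x2 ** 2

def map_to_lf(X_hf):           # assumed mapping between input domains (drop x2)
    return X_hf[:, :1]

# Large cheap low-fidelity data-set, small expensive high-fidelity data-set.
X_lf = rng.uniform(0, 1, size=(60, 1))
y_lf = f_lf(X_lf[:, 0])
X_hf = rng.uniform(0, 1, size=(8, 2))
y_hf = f_hf(X_hf[:, 0], X_hf[:, 1])

# 1) Fit the low-fidelity GP on its own input domain.
gp_lf = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp_lf.fit(X_lf, y_lf)

# 2) Fit the high-fidelity GP on [x1, x2, predicted low-fidelity value].
lf_at_hf = gp_lf.predict(map_to_lf(X_hf)).reshape(-1, 1)
gp_hf = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp_hf.fit(np.hstack([X_hf, lf_at_hf]), y_hf)

# Prediction: propagate a new high-fidelity input through both fidelity levels.
X_new = np.array([[0.5, 0.2]])
lf_new = gp_lf.predict(map_to_lf(X_new)).reshape(-1, 1)
mean, std = gp_hf.predict(np.hstack([X_new, lf_new]), return_std=True)
print(mean, std)
```

The paper replaces this simple feed-forward composition with a Bayesian MF-DGP that learns non-linear correlations between fidelities and, in the proposed extension, a mapping between the different input parametrizations rather than a fixed hand-written one like `map_to_lf` above.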

Authors (5)
  1. Ali Hebbal (4 papers)
  2. Loic Brevault (49 papers)
  3. Mathieu Balesdent (11 papers)
  4. El-Ghazali Talbi (21 papers)
  5. Nouredine Melab (6 papers)
Citations (30)

Summary

We haven't generated a summary for this paper yet.