An efficient methodology for the analysis and modeling of computer experiments with large number of inputs

Published 24 Apr 2017 in math.ST and stat.TH (arXiv:1704.07090v1)

Abstract: Complex computer codes are often too time-consuming to be used directly for uncertainty, sensitivity, optimization, and robustness analyses. A widely accepted way to circumvent this problem is to replace cpu-time-expensive computer models with cpu-inexpensive mathematical functions, called metamodels. For example, the Gaussian process (Gp) model has shown strong capabilities for solving practical problems, which often involve several interlinked issues. However, for high-dimensional experiments (with typically several tens of inputs), the Gp metamodel building process remains difficult, or even unfeasible, and the application of variable selection techniques cannot be avoided. In this paper, we present a general methodology for building a Gp metamodel with a large number of inputs in a very efficient manner. While our work focuses on the Gp metamodel, its principles are fully generic and can be applied to any type of metamodel. The objective is twofold: to use a minimal number of computer experiments and to obtain a highly predictive metamodel. This methodology is successfully applied to an industrial computer code.
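The surrogate idea in the abstract, training a cheap Gp metamodel on a small design of computer experiments so the expensive simulator need not be called at prediction time, can be sketched minimally as follows. This is an illustrative assumption-laden sketch, not the paper's method: the `expensive_code` function stands in for the industrial simulator, and the squared-exponential kernel with a fixed length scale is a simplifying choice (the paper's methodology additionally handles hyperparameter estimation and variable selection in high dimension).

```python
import numpy as np

def expensive_code(x):
    # Stand-in for a cpu-expensive simulator (hypothetical test function).
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

def rbf_kernel(A, B, length_scale=0.5):
    # Squared-exponential covariance between input sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def fit_gp(X, y, noise=1e-8):
    # Cholesky-based solve of K alpha = y on the training design.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return X, alpha

def predict_gp(model, Xnew):
    # Gp posterior mean: cross-covariance times precomputed weights.
    X, alpha = model
    return rbf_kernel(Xnew, X) @ alpha

rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(40, 2))  # small design of experiments
y_train = expensive_code(X_train)           # the only expensive calls
model = fit_gp(X_train, y_train)

# After fitting, predictions are cheap linear algebra, no simulator calls.
X_test = rng.uniform(-1, 1, size=(200, 2))
y_pred = predict_gp(model, X_test)
```

With tens of inputs rather than two, the covariance estimation degrades, which is exactly the regime where the paper argues variable selection becomes unavoidable.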
