Hybrid iterative ensemble smoother for history matching of hierarchical models (2206.01116v1)

Published 2 Jun 2022 in stat.CO

Abstract: The choice of the prior model can have a large impact on the ability to assimilate data. In standard applications of ensemble-based data assimilation, all realizations in the initial ensemble are generated from the same covariance matrix with the implicit assumption that this covariance is appropriate for the problem. In a hierarchical approach, the parameters of the covariance function, for example the variance, the orientation of the anisotropy and the ranges in two principal directions, may all be uncertain. Thus, the hierarchical approach is much more robust against model misspecification. In this paper, three approaches to sampling from the posterior for hierarchical parameterizations are discussed: an optimization-based sampling approach (RML), an iterative ensemble smoother (IES), and a novel hybrid of the previous two approaches (hybrid IES). The three approximate sampling methods are applied to a linear-Gaussian inverse problem for which it is possible to compare results with an exact "marginal-then-conditional" approach. Additionally, the IES and the hybrid IES methods are tested on a two-dimensional flow problem with uncertain anisotropy in the prior covariance. The standard IES method is shown to perform poorly in the flow examples because of the poor representation of the local sensitivity matrix by the ensemble-based method. The hybrid method, however, samples well even with a relatively small ensemble size.
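To make the ensemble-based update concrete, below is a minimal sketch of a single perturbed-observation ensemble-smoother step on a linear-Gaussian inverse problem of the kind the abstract uses as a test case. This is the non-iterative building block underlying IES, not the paper's hybrid method; the forward operator `G`, the dimensions, and the covariances are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian setup (illustrative only):
# prior m ~ N(0, I), data d = G m + e, e ~ N(0, C_d)
n_m, n_d, n_e = 10, 5, 200           # parameters, observations, ensemble size
G = rng.standard_normal((n_d, n_m))  # linear forward operator
C_d = 0.1 * np.eye(n_d)              # observation-error covariance

m_true = rng.standard_normal(n_m)
d_obs = G @ m_true + rng.multivariate_normal(np.zeros(n_d), C_d)

# Prior ensemble: each column is one realization drawn from the prior
M = rng.standard_normal((n_m, n_e))

# One ensemble-smoother update (Kalman-like step with perturbed observations)
D = G @ M                                   # predicted data per member
dM = M - M.mean(axis=1, keepdims=True)      # parameter anomalies
dD = D - D.mean(axis=1, keepdims=True)      # predicted-data anomalies
C_md = dM @ dD.T / (n_e - 1)                # ensemble cross-covariance
C_dd = dD @ dD.T / (n_e - 1)                # ensemble predicted-data covariance
K = C_md @ np.linalg.inv(C_dd + C_d)        # ensemble Kalman gain
E = rng.multivariate_normal(np.zeros(n_d), C_d, size=n_e).T  # obs perturbations
M_post = M + K @ (d_obs[:, None] + E - D)   # updated (posterior) ensemble
```

An iterative scheme such as IES repeats a damped version of this update, re-evaluating the forward model each iteration; the paper's point is that when the ensemble-estimated sensitivities in `C_md` and `C_dd` are poor, a hybrid with optimization-based (RML) sampling remains accurate at small ensemble sizes.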
