
Objective-Sensitive Principal Component Analysis for High-Dimensional Inverse Problems

Published 2 Jun 2020 in cs.LG, math.OC, and stat.ML | arXiv:2006.04527v1

Abstract: We present a novel approach to adaptive, differentiable parameterization of large-scale random fields. Coupled with any gradient-based optimization algorithm, the approach can be applied to a variety of optimization problems, including history matching. The technique is based on principal component analysis (PCA) but modifies the purely data-driven basis of principal components to account for objective-function behavior. To define an efficient encoding, Gradient-Sensitive PCA uses the gradient of the objective function with respect to the model parameters. We propose computationally efficient implementations of the technique, two of which are based on stationary perturbation theory (SPT). The optimality, correctness, and low computational cost of the new encoding approach are tested, verified, and discussed. Three algorithms for optimal parameter decomposition are presented and applied to a 2D synthetic history-matching objective. The results demonstrate improvements in encoding quality with respect to objective-function minimization and the distributional patterns of the desired field. Possible applications and extensions are outlined.
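The core idea, a PCA basis adjusted using the objective-function gradient to give a differentiable low-dimensional encoding, can be sketched as follows. This is a minimal illustrative sketch, not the paper's SPT-based construction: the toy objective, the reweighting rule, and all variable names are assumptions introduced here for demonstration.

```python
import numpy as np

# Hypothetical setup: an ensemble of realizations of a random field,
# flattened into rows of X (n_samples x n_params).
rng = np.random.default_rng(0)
n_samples, n_params, n_comp = 50, 200, 10
X = rng.standard_normal((n_samples, n_params))

# Standard PCA basis via SVD of the centered ensemble.
x_mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - x_mean, full_matrices=False)
phi = Vt[:n_comp].T  # (n_params x n_comp) matrix of principal components

# A toy objective and its gradient w.r.t. the model parameters m
# (stand-in for a history-matching misfit; not from the paper).
def objective_grad(m):
    return 2.0 * (m - x_mean)  # gradient of ||m - x_mean||^2

# Objective-sensitive reweighting (illustrative, not the paper's SPT
# formulas): scale each component by how strongly the objective
# gradient projects onto it, then re-orthonormalize via QR.
g = objective_grad(X[0])
weights = 1.0 + np.abs(phi.T @ g) / (np.linalg.norm(g) + 1e-12)
phi_sensitive, _ = np.linalg.qr(phi * weights)

# Differentiable encoding: m(z) = mean + Phi z, so any gradient-based
# optimizer can work in the low-dimensional z space.
z = phi_sensitive.T @ (X[0] - x_mean)
m_rec = x_mean + phi_sensitive @ z
```

Because the map from the latent coordinates `z` to the field `m(z)` is linear, gradients of any downstream objective pass through it cleanly, which is what makes the parameterization usable inside gradient-based history matching.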

Citations (3)
