A Multiscale Method for Model Order Reduction in PDE Parameter Estimation (1707.07598v3)

Published 24 Jul 2017 in math.NA

Abstract: Estimating parameters of Partial Differential Equations (PDEs) is of interest in a number of applications such as geophysical and medical imaging. Parameter estimation is commonly phrased as a PDE-constrained optimization problem that can be solved iteratively using gradient-based optimization. A computational bottleneck in such approaches is that the underlying PDEs need to be solved numerous times before the model is reconstructed with sufficient accuracy. One way to reduce this computational burden is by using Model Order Reduction (MOR) techniques such as the Multiscale Finite Volume Method (MSFV). In this paper, we apply MSFV for solving high-dimensional parameter estimation problems. Given a finite volume discretization of the PDE on a fine mesh, the MSFV method reduces the problem size by computing a parameter-dependent projection onto a nested coarse mesh. A novelty in our work is the integration of MSFV into a PDE-constrained optimization framework, which updates the reduced space in each iteration. We also present a computationally tractable way of differentiating the MOR solution that acknowledges the change of basis. As we demonstrate in our numerical experiments, our method leads to computational savings, particularly for large-scale parameter estimation problems, and can benefit from parallelization.
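
To make the workflow described in the abstract concrete, the sketch below illustrates, in Python/NumPy, the general pattern of a reduced-order solve inside a gradient-based parameter estimation loop: a parameter-dependent prolongation maps a small coarse-mesh system back to the fine mesh, and the reduced space is rebuilt every time the parameters change. This is a minimal toy under stated assumptions, not the paper's MSFV implementation: the 1D finite-volume problem, the conductivity-weighted aggregation standing in for multiscale basis functions, the finite-difference gradient (the paper instead differentiates the MOR solution directly, accounting for the change of basis), and all function names (fine_system, prolongation, reduced_solve, misfit) are hypothetical.

```python
import numpy as np


def fine_system(m, n):
    """Assemble a toy 1D cell-centered finite-volume system A(m) u = q on the fine mesh.
    (Stand-in for the fine-scale PDE discretization; not the paper's discretization.)"""
    k = np.exp(m)                                   # cell conductivities
    t = 2.0 * k[:-1] * k[1:] / (k[:-1] + k[1:])     # harmonic face transmissibilities
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i] += t[i]
        A[i + 1, i + 1] += t[i]
        A[i, i + 1] -= t[i]
        A[i + 1, i] -= t[i]
    A[0, 0] += 2.0 * k[0]                           # Dirichlet u = 0 at the left boundary
    A[-1, -1] += 2.0 * k[-1]                        # Dirichlet u = 0 at the right boundary
    q = np.ones(n)                                  # unit source in every cell
    return A, q


def prolongation(m, n, nc):
    """Parameter-dependent prolongation P(m) from nc coarse cells to n fine cells.
    A crude conductivity-weighted aggregation standing in for MSFV basis functions."""
    k = np.exp(m)
    size = n // nc
    P = np.zeros((n, nc))
    for j in range(nc):
        idx = slice(j * size, (j + 1) * size)
        P[idx, j] = k[idx] / k[idx].sum()           # local weights, one column per coarse cell
    return P


def reduced_solve(m, n, nc):
    """Reduced-order solve: project A(m) onto the coarse space, solve, prolongate back."""
    A, q = fine_system(m, n)
    P = prolongation(m, n, nc)                      # the reduced space depends on m ...
    u_c = np.linalg.solve(P.T @ A @ P, P.T @ q)     # ... so it is rebuilt for every new m
    return P @ u_c                                  # approximate fine-mesh solution


def misfit(m, d_obs, n, nc):
    r = reduced_solve(m, n, nc) - d_obs
    return 0.5 * r @ r


if __name__ == "__main__":
    n, nc = 64, 8
    rng = np.random.default_rng(0)
    m_true = 0.3 * rng.standard_normal(n)
    d_obs = reduced_solve(m_true, n, nc)            # synthetic "observations"

    m = np.zeros(n)
    for it in range(10):
        f0 = misfit(m, d_obs, n, nc)
        # Finite-difference gradient for brevity; the paper instead differentiates
        # the reduced solution, accounting for the change of basis.
        g = np.zeros(n)
        eps = 1e-6
        for i in range(n):
            mp = m.copy()
            mp[i] += eps
            g[i] = (misfit(mp, d_obs, n, nc) - f0) / eps
        step = 1.0                                  # backtracking line search on steepest descent
        while step > 1e-8 and misfit(m - step * g, d_obs, n, nc) >= f0:
            step *= 0.5
        m = m - step * g
        print(f"iter {it:2d}  misfit {f0:.4e}")
```

In an actual MSFV setting, the prolongation columns would come from local basis-function solves on coarse subdomains rather than simple aggregation, and the gradient would be computed with adjoint techniques that account for the dependence of the basis on the parameters, along the lines the abstract describes.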
