
Complexity analysis of quasi continuous level Monte Carlo (2305.15949v2)

Published 25 May 2023 in math.NA and cs.NA

Abstract: Continuous level Monte Carlo is an unbiased, continuous variant of the celebrated multilevel Monte Carlo method. The approximation level is treated as continuous, so the quantity of interest becomes a stochastic process. Continuous level Monte Carlo methods naturally accommodate samplewise adaptive mesh refinements, guided by goal-oriented error estimators. In the estimator, the samplewise refinement levels are drawn from an exponentially distributed random variable. Unfortunately, in practical examples this leads to higher costs due to the high variance of the samples. In this paper we propose a variant of continuous level Monte Carlo in which a quasi-Monte Carlo sequence is used to "sample" the exponential random variable. We prove a complexity theorem for this novel estimator and show, both theoretically and in practice, that it reduces the variance of the overall estimator.
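The core idea of the proposed variant can be sketched in a few lines: instead of drawing the exponential level variable with i.i.d. pseudo-random numbers, feed a low-discrepancy (quasi-Monte Carlo) sequence through the inverse transform of the exponential distribution. The toy sketch below is not the paper's estimator; it uses a hypothetical level-dependent quantity of interest g(l) = 2^(-l) (mimicking a contribution that decays with refinement level l) and a base-2 van der Corput sequence, purely to illustrate the sampling swap.

```python
import math
import random

def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    points = []
    for i in range(1, n + 1):
        q, denom, x = i, 1, 0.0
        while q > 0:
            q, r = divmod(q, base)
            denom *= base
            x += r / denom
        points.append(x)
    return points

def exp_inverse(u, rate=1.0):
    """Inverse-transform map from a uniform u in [0, 1) to an Exp(rate) sample."""
    return -math.log(1.0 - u) / rate

def estimate(uniforms, g, rate=1.0):
    """Estimate E[g(L)], L ~ Exp(rate), from a stream of uniform inputs."""
    return sum(g(exp_inverse(u, rate)) for u in uniforms) / len(uniforms)

# Hypothetical quantity of interest: g(l) = 2^{-l}.
g = lambda l: 2.0 ** (-l)
# Analytic reference value of E[2^{-L}] for L ~ Exp(1).
exact = 1.0 / (1.0 + math.log(2.0))

n = 1024
random.seed(0)
mc_err = abs(estimate([random.random() for _ in range(n)], g) - exact)
qmc_err = abs(estimate(van_der_corput(n), g) - exact)
print(f"plain MC error: {mc_err:.2e}, quasi-MC error: {qmc_err:.2e}")
```

With the same number of samples, the quasi-Monte Carlo input stream covers [0, 1) far more evenly than i.i.d. uniforms, which translates into a smaller error for the transformed exponential samples; this is the mechanism behind the variance reduction claimed in the abstract.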

Citations (1)
