
Randomized Complexity of Vector-Valued Approximation (2306.13697v1)

Published 23 Jun 2023 in math.NA and cs.NA

Abstract: We study the randomized $n$-th minimal errors (and hence the complexity) of vector-valued approximation. In a paper by the author [Randomized complexity of parametric integration and the role of adaption I. Finite dimensional case (preprint)] a long-standing problem of Information-Based Complexity was solved: Is there a constant $c>0$ such that for all linear problems $\mathcal{P}$ the randomized non-adaptive and adaptive $n$-th minimal errors can deviate at most by a factor of $c$? That is, does the following hold for all linear $\mathcal{P}$ and $n\in {\mathbb N}$ \begin{equation*} e_n^{\rm ran-non} (\mathcal{P})\le c\, e_n^{\rm ran} (\mathcal{P}) \, {\bf ?} \end{equation*} The analysis of vector-valued mean computation showed that the answer is negative. More precisely, there are instances of this problem where the gap between non-adaptive and adaptive randomized minimal errors can be (up to log factors) of the order $n^{1/8}$. This raises the question about the maximal possible deviation. In this paper we show that for certain instances of vector-valued approximation the gap is $n^{1/2}$ (again, up to log factors).
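
For orientation, the two gap results quoted in the abstract can be written schematically as follows. Here $\gtrsim$ is used loosely to hide the unspecified logarithmic factors, and $\mathcal{P}_{\rm mean}$, $\mathcal{P}_{\rm app}$ are hypothetical labels for the respective problem instances, not notation taken from the paper:

\begin{equation*} e_n^{\rm ran-non}(\mathcal{P}_{\rm mean}) \gtrsim n^{1/8}\, e_n^{\rm ran}(\mathcal{P}_{\rm mean}) \quad \text{(vector-valued mean computation, prior work),} \end{equation*}

\begin{equation*} e_n^{\rm ran-non}(\mathcal{P}_{\rm app}) \gtrsim n^{1/2}\, e_n^{\rm ran}(\mathcal{P}_{\rm app}) \quad \text{(certain instances of vector-valued approximation, this paper).} \end{equation*}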

Citations (7)
