The $L_p$-discrepancy for finite $p>1$ suffers from the curse of dimensionality (2403.07961v2)

Published 12 Mar 2024 in math.NA and cs.NA

Abstract: The $L_p$-discrepancy is a classical quantitative measure for the irregularity of distribution of an $N$-element point set in the $d$-dimensional unit cube. Its inverse for dimension $d$ and error threshold $\varepsilon \in (0,1)$ is the minimal number of points in $[0,1)^d$ required so that the minimal normalized $L_p$-discrepancy is less than or equal to $\varepsilon$. It is well known that the inverse of the $L_2$-discrepancy grows exponentially fast with the dimension $d$, i.e., we have the curse of dimensionality, whereas the inverse of the $L_{\infty}$-discrepancy depends exactly linearly on $d$. The behavior of the inverse of the $L_p$-discrepancy for general $p \not\in \{2,\infty\}$ had been an open problem for many years. Recently, the curse of dimensionality for the $L_p$-discrepancy was shown for an infinite sequence of values of $p$ in $(1,2]$, but the general result seemed to be out of reach. In the present paper we show that the $L_p$-discrepancy suffers from the curse of dimensionality for all $p \in (1,\infty)$; only the case $p=1$ remains open. This result follows from a more general result that we prove for the worst-case error of positive quadrature formulas on an anchored Sobolev space of functions that are once differentiable in each variable and whose first mixed derivative has finite $L_q$-norm, where $q$ is the Hölder conjugate of $p$.
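For reference, the quantities in the abstract can be written in their standard form from the discrepancy literature; the paper's exact normalization may differ slightly. For a point set $\mathcal{P} = \{\boldsymbol{x}_1, \ldots, \boldsymbol{x}_N\} \subseteq [0,1)^d$, the $L_p$-discrepancy measures the deviation of the empirical distribution from the uniform distribution over anchored boxes $[\boldsymbol{0},\boldsymbol{t})$,
$$ L_{p,N}(\mathcal{P}) = \left( \int_{[0,1]^d} \left| \frac{1}{N} \sum_{n=1}^{N} \mathbf{1}_{[\boldsymbol{0},\boldsymbol{t})}(\boldsymbol{x}_n) - t_1 t_2 \cdots t_d \right|^p \, \mathrm{d}\boldsymbol{t} \right)^{1/p}, $$
its inverse is
$$ N_p(\varepsilon, d) = \min\bigl\{ N \in \mathbb{N} : \text{there is an $N$-element } \mathcal{P} \subseteq [0,1)^d \text{ with normalized } L_{p,N}(\mathcal{P}) \le \varepsilon \bigr\}, $$
and the curse of dimensionality means that there exist constants $C, \tau > 0$ and $\varepsilon_0 \in (0,1)$ such that $N_p(\varepsilon, d) \ge C (1+\tau)^d$ for all $\varepsilon \le \varepsilon_0$ and infinitely many $d$.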
