Sharp Non-Asymptotic Bounds for the Star Discrepancy of Double-Infinite Random Matrices via Optimal Covering Numbers
Abstract: We establish sharp non-asymptotic probabilistic bounds for the star discrepancy of double-infinite random matrices -- a canonical model for sequences of random point sets in high dimensions. By integrating the recently proved \textbf{optimal covering numbers for axis-parallel boxes} (Gnewuch, 2024) into the dyadic chaining framework, we achieve \textbf{explicitly computable constants} that improve upon all previously known bounds. For dimension $d \ge 3$, we prove that with high probability,
\[
D_{N,d}^{*} \le \sqrt{\alpha A_d + \beta B \,\frac{\ln \log_2 N}{d}} \, \sqrt{\frac{d}{N}},
\]
where $A_d$ is given by an explicit series and satisfies $A_3 \le 745$, a \textbf{14\% improvement} over the previous best constant of 868 (Fiedler et al., 2023). For $d=2$, we obtain the currently smallest known constant $A_2 \le 915$. Our analysis reveals a \textbf{precise trade-off} between the dimensional dependence and the logarithmic factor in $N$, highlighting how optimal covering estimates directly translate to tighter discrepancy bounds. These results immediately yield improved error guarantees for \textbf{quasi-Monte Carlo integration, uncertainty quantification, and high-dimensional sampling}, and provide a new benchmark for the probabilistic analysis of geometric discrepancy.

\textbf{Keywords:} Star discrepancy, double-infinite random matrices, covering numbers, dyadic chaining, high-dimensional integration, quasi-Monte Carlo, probabilistic bounds.
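As a rough illustration of how the displayed bound is evaluated, the following Python sketch computes its right-hand side for given $N$ and $d$. Only $A_3 \le 745$ and $A_2 \le 915$ are stated in the abstract; the values used here for $B$, $\alpha$, and $\beta$ are hypothetical placeholders, not constants from the paper.

```python
import math

def star_discrepancy_bound(N, d, A_d, B, alpha=1.0, beta=1.0):
    """Right-hand side of the abstract's bound:
    sqrt(alpha*A_d + beta*B*ln(log2 N)/d) * sqrt(d/N).

    A_d is the dimension-dependent constant from the paper (e.g. A_3 <= 745);
    B, alpha, beta are placeholder values here, since the abstract does not
    report them. Requires N >= 2 so that ln(log2 N) is defined.
    """
    return math.sqrt(alpha * A_d + beta * B * math.log(math.log2(N)) / d) \
        * math.sqrt(d / N)

# Example: d = 3 with the stated constant A_3 <= 745 and a hypothetical B = 100.
print(star_discrepancy_bound(N=2**20, d=3, A_d=745, B=100))
```

For $N = 2^{20}$ points in dimension $d = 3$, the sketch returns a bound of roughly $0.05$ under these assumed constants, illustrating the $\sqrt{d/N}$ decay highlighted in the abstract.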