A note on $L^1$-Convergence of the Empiric Minimizer for unbounded functions with fast growth
Published 8 Mar 2023 in math.ST, stat.ML, and stat.TH (arXiv:2303.04444v1)
Abstract: For $V : \mathbb{R}^d \to \mathbb{R}$ coercive, we study the convergence rate for the $L^1$-distance of the empiric minimizer, which is the true minimum of the function $V$ sampled with noise with a finite number $n$ of samples, to the minimum of $V$. We show that in general, for unbounded functions with fast growth, the convergence rate is bounded above by $a_n n^{-1/q}$, where $q$ is the dimension of the latent random variable and where $a_n = o(n^{\varepsilon})$ for every $\varepsilon > 0$. We then present applications to optimization problems arising in Machine Learning and in Monte Carlo simulation.
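For intuition, here is a minimal simulation sketch of one common reading of this setup: take the empiric minimizer to be the argmin of a sample-average objective $V_n$ built from $n$ noisy draws, and estimate its $L^1$ gap to the true minimum of $V$. The quadratic choice of $V$, the Gaussian latent variable, and the helper `l1_gap` below are illustrative assumptions, not the paper's construction, and this toy does not reproduce the $n^{-1/q}$ regime for fast-growing unbounded $V$; it only shows how such a gap can be measured numerically.

```python
import numpy as np

# Illustrative toy (assumption, not the paper's setup):
# V(x) = E[(x - Z)^2] with Z ~ N(0, 1), so V is coercive, min V = 1 at x = 0,
# and V(x) - min V = x^2.  The "empiric minimizer" is read here as the argmin
# of the sample average V_n(x) = (1/n) * sum_i (x - Z_i)^2, i.e. the sample mean.

rng = np.random.default_rng(0)

def l1_gap(n, trials=2000):
    """Monte Carlo estimate of E|V(x_n) - min V| for sample size n."""
    gaps = np.empty(trials)
    for t in range(trials):
        z = rng.standard_normal(n)   # n noisy samples of the latent variable
        x_n = z.mean()               # empirical minimizer of V_n
        gaps[t] = x_n ** 2           # V(x_n) - min V in this toy model
    return gaps.mean()

for n in (10, 100, 1000, 10000):
    print(f"n = {n:6d}   estimated L1 gap ~ {l1_gap(n):.5f}")
```

In this toy the gap decays like $1/n$; the paper's point is that for unbounded $V$ with fast growth the guaranteed rate degrades to $a_n n^{-1/q}$, governed by the dimension $q$ of the latent random variable.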