A Discrepancy Bound for a Deterministic Acceptance-Rejection Sampler (1307.1185v2)
Abstract: We consider an acceptance-rejection sampler based on a deterministic driver sequence. The deterministic sequence is chosen such that the discrepancy between the empirical target distribution and the target distribution is small. We use quasi-Monte Carlo (QMC) point sets for this purpose. The empirical evidence shows convergence rates beyond the crude Monte Carlo rate of $N^{-1/2}$. We prove that the discrepancy of samples generated by the QMC acceptance-rejection sampler is bounded from above by $N^{-1/s}$. A lower bound shows that for any given driver sequence, there always exists a target density such that the star discrepancy is at least $N^{-2/(s+1)}$. For a general density, whose domain is the real state space $\mathbb{R}^{s-1}$, the inverse Rosenblatt transformation can be used to convert samples from the $(s-1)$-dimensional cube to $\mathbb{R}^{s-1}$. We show that this transformation is measure preserving. This way, under certain conditions, we obtain the same convergence rate for a general target density defined on $\mathbb{R}^{s-1}$. Moreover, we consider a deterministic reduced acceptance-rejection algorithm recently introduced by Barekat and Caflisch [F. Barekat and R. Caflisch. Simulation with Fluctuation and Singular Rates. arXiv:1310.4555 [math.NA], 2013].
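To make the setup concrete, the following is a minimal sketch of an acceptance-rejection sampler driven by a deterministic QMC point set, assuming a hypothetical one-dimensional target density on $[0,1]$ bounded by a known constant. It uses a Sobol' sequence from SciPy as the driver sequence; the density `target_pdf`, the bound `L`, and the function names are illustrative assumptions, not the paper's exact construction or analysis setting.

```python
# Sketch: acceptance-rejection with a deterministic Sobol' driver sequence.
# The last coordinate of each (dim+1)-dimensional QMC point replaces the
# uniform acceptance variable of the classical randomized algorithm.
import numpy as np
from scipy.stats import qmc


def target_pdf(x):
    # Hypothetical target density on [0, 1]: Beta(2, 1), bounded by L = 2.
    return 2.0 * x


def qmc_acceptance_rejection(m, pdf, bound, dim=1):
    """Return accepted proposals from a 2**m-point Sobol' driver sequence."""
    sobol = qmc.Sobol(d=dim + 1, scramble=False)   # deterministic point set
    points = sobol.random_base2(m)                 # 2**m points in [0,1]^(dim+1)
    x, u = points[:, :dim], points[:, dim]
    accept = bound * u <= pdf(x[:, 0])             # accept if u <= pdf(x)/bound
    return x[accept]


samples = qmc_acceptance_rejection(m=10, pdf=target_pdf, bound=2.0)
print(len(samples), "accepted out of", 2**10)
```

The empirical distribution of the accepted points is then compared with the target distribution via the star discrepancy; for targets on $\mathbb{R}^{s-1}$ one would first map the accepted cube samples through the inverse Rosenblatt transformation, as described in the abstract.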