Computing Star Discrepancies with Numerical Black-Box Optimization Algorithms (2306.16998v1)

Published 29 Jun 2023 in cs.NE

Abstract: The $L_{\infty}$ star discrepancy is a measure for the regularity of a finite set of points taken from $[0,1)^d$. Low discrepancy point sets are highly relevant for Quasi-Monte Carlo methods in numerical integration and several other applications. Unfortunately, computing the $L_{\infty}$ star discrepancy of a given point set is known to be a hard problem, with the best exact algorithms falling short for even moderate dimensions around 8. However, despite the difficulty of finding the global maximum that defines the $L_{\infty}$ star discrepancy of the set, local evaluations at selected points are inexpensive. This makes the problem tractable by black-box optimization approaches. In this work we compare 8 popular numerical black-box optimization algorithms on the $L_{\infty}$ star discrepancy computation problem, using a wide set of instances in dimensions 2 to 15. We show that all used optimizers perform very badly on a large majority of the instances and that in many cases random search outperforms even the more sophisticated solvers. We suspect that state-of-the-art numerical black-box optimization techniques fail to capture the global structure of the problem, an important shortcoming that may guide their future development. We also provide a parallel implementation of the best-known algorithm to compute the discrepancy.
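The black-box formulation rests on the fact noted in the abstract: evaluating the discrepancy function at a single anchor point is cheap, even though the supremum over all anchors is hard. A minimal Python sketch of this idea (not the paper's code; function names and the random-search baseline are illustrative, and the local value shown uses the standard half-open-box formulation):

```python
import numpy as np

def local_discrepancy(y, points):
    """Local discrepancy at anchor y in [0,1]^d:
    |volume of the box [0, y) - fraction of the point set inside it|.
    The L-infinity star discrepancy is the supremum of this value over
    all anchors y, which is what the black-box optimizers search for."""
    volume = np.prod(y)
    inside = np.all(points < y, axis=1).mean()
    return abs(volume - inside)

def random_search(points, budget=10_000, seed=None):
    """Baseline optimizer: sample anchors uniformly, keep the best value.
    Returns a lower bound on the true star discrepancy of the set."""
    rng = np.random.default_rng(seed)
    d = points.shape[1]
    best = 0.0
    for _ in range(budget):
        y = rng.random(d)
        best = max(best, local_discrepancy(y, points))
    return best

# Example: estimate the discrepancy of 64 uniform random points in d = 3.
rng = np.random.default_rng(0)
pts = rng.random((64, 3))
print(random_search(pts, budget=5000, seed=1))
```

Each call to `local_discrepancy` costs only O(n·d), which is why even a naive optimizer can probe the landscape freely; the paper's finding is that sophisticated solvers nonetheless struggle to locate the global maximum of this function.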
