
Runtime Analysis of Quality Diversity Algorithms

Published 30 May 2023 in cs.NE | arXiv:2305.18966v2

Abstract: Quality diversity (QD) is a branch of evolutionary computation that has gained increasing interest in recent years. The Map-Elites QD approach defines a feature space, i.e., a partition of the search space, and stores the best solution found for each cell of this space. We study a simple QD algorithm in the context of pseudo-Boolean optimisation on the "number of ones" feature space, where the $i$th cell stores the best solution amongst those with a number of ones in $[(i-1)k, ik-1]$. Here $k$ is a granularity parameter with $1 \leq k \leq n+1$. We give a tight bound on the expected time until all cells are covered, for arbitrary fitness functions and for all $k$, and analyse the expected optimisation time of QD on OneMax and other problems whose structure aligns favourably with the feature space. On combinatorial problems, we show that QD efficiently finds a $(1-1/e)$-approximation when maximising any monotone submodular function under a single uniform cardinality constraint. Defining the feature space as the number of connected components of a connected graph, we show that QD finds a minimum spanning tree in expected polynomial time.
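The archive mechanism described in the abstract can be sketched in a few lines. The following is a minimal illustration only, not the paper's exact pseudocode: the function and parameter names are placeholders, and it assumes uniform parent selection over the archive and standard bit-flip mutation with rate $1/n$. A bit string with $m$ ones is mapped (0-indexed) to cell $\lfloor m/k \rfloor$, matching the $[(i-1)k, ik-1]$ partition, and each cell retains the best-fitness solution seen so far.

```python
import random

def qd_map_elites(n, k, fitness, max_evals=100_000):
    """Minimal Map-Elites-style QD sketch on bit strings of length n.

    Feature space: the number of ones, with granularity k, so a
    solution x with m ones lands in cell m // k (0-indexed).
    Each cell keeps the best solution seen for that cell.
    """
    cells = {}  # cell index -> (fitness value, bit string)

    def try_insert(x):
        c = sum(x) // k
        f = fitness(x)
        if c not in cells or f > cells[c][0]:
            cells[c] = (f, x)

    # Seed the archive with a uniformly random bit string.
    try_insert([random.randint(0, 1) for _ in range(n)])

    for _ in range(max_evals):
        # Uniform parent selection over current elites,
        # then standard bit-flip mutation (each bit flips w.p. 1/n).
        _, parent = random.choice(list(cells.values()))
        child = [b ^ (random.random() < 1 / n) for b in parent]
        try_insert(child)
    return cells

# Example: OneMax, i.e. fitness(x) = number of ones in x.
n, k = 20, 4
archive = qd_map_elites(n, k, fitness=sum)
best = max(f for f, _ in archive.values())
```

On OneMax the feature space aligns with the fitness structure: each cell's elite climbs towards the upper end of its own interval, and the top cell eventually holds the all-ones optimum.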
