
A Block-Coordinate Descent EMO Algorithm: Theoretical and Empirical Analysis (2404.03838v2)

Published 4 Apr 2024 in cs.NE and cs.AI

Abstract: We consider whether conditions exist under which block-coordinate descent is asymptotically efficient in evolutionary multi-objective optimization, addressing an open problem. In block-coordinate descent, an optimization problem is decomposed into $k$ blocks of decision variables, and each block is optimized in sequence while the others are held fixed. The technique is used in some large-scale optimization problems, such as airline scheduling; however, its use in multi-objective optimization is less studied. We propose a block-coordinate version of GSEMO and compare its running time to that of the standard GSEMO algorithm. Theoretical and empirical results on a bi-objective test function, a variant of LOTZ, demonstrate the existence of cases where block-coordinate descent is faster. The result may yield wider insights into this class of algorithms.
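To make the idea concrete, the following is a minimal illustrative sketch of a block-coordinate GSEMO on the classic bi-objective LOTZ function (LeadingOnes, TrailingZeros). It is not the authors' exact algorithm: the equal-size block layout, the within-block mutation rate of $1/\text{block length}$, and the round-robin cycling through blocks are assumptions made for illustration.

```python
import random

def lotz(x):
    """Bi-objective LOTZ: (number of leading ones, number of trailing zeros)."""
    lo = 0
    for b in x:
        if b != 1:
            break
        lo += 1
    tz = 0
    for b in reversed(x):
        if b != 0:
            break
        tz += 1
    return (lo, tz)

def dominates(f, g):
    """True if objective vector f dominates g (weakly better everywhere, not equal)."""
    return all(a >= b for a, b in zip(f, g)) and f != g

def block_gsemo(n=12, k=3, iters=20000, seed=0):
    """Block-coordinate GSEMO sketch: standard bit mutation is restricted to
    one of k equal blocks of the bit string, cycling through blocks in order.
    (Block layout, within-block mutation rate, and cycling order are
    illustrative assumptions, not taken from the paper.)"""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    pop = {tuple(x): lotz(x)}  # archive of mutually non-dominated solutions
    block_len = n // k
    for t in range(iters):
        parent = list(rng.choice(list(pop)))   # uniform choice from the archive
        b = t % k                              # round-robin over blocks
        lo_idx, hi_idx = b * block_len, (b + 1) * block_len
        child = parent[:]
        for i in range(lo_idx, hi_idx):        # bit mutation inside the block only
            if rng.random() < 1.0 / block_len:
                child[i] ^= 1
        fc = lotz(child)
        if any(dominates(f, fc) for f in pop.values()):
            continue                           # child is dominated: discard it
        # keep the child; drop archive members it dominates
        pop = {y: f for y, f in pop.items() if not dominates(fc, f)}
        pop[tuple(child)] = fc
    return pop
```

The archive maintains the usual GSEMO invariant: no member dominates another, so at termination it approximates the Pareto front $\{1^i 0^{n-i} : 0 \le i \le n\}$ of LOTZ.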

