
A Newton Method for Hausdorff Approximations of the Pareto Front within Multi-objective Evolutionary Algorithms (2405.05721v2)

Published 9 May 2024 in cs.NE

Abstract: A common goal in evolutionary multi-objective optimization is to find suitable finite-size approximations of the Pareto front of a given multi-objective optimization problem. While many multi-objective evolutionary algorithms have proven to be very efficient in finding good Pareto front approximations, they may require substantial resources or may even fail to obtain optimal, or nearly optimal, approximations. Here, optimality is implicitly defined by the chosen performance indicator. In this work, we propose a set-based Newton method for Hausdorff approximations of the Pareto front to be used within multi-objective evolutionary algorithms. To this end, we first generalize the previously proposed Newton step for the performance indicator to the treatment of constrained problems and general reference sets. To approximate the target Pareto front, we propose a particular strategy for generating the reference set that utilizes the data gathered by the evolutionary algorithm during its run. Finally, we demonstrate the benefit of the Newton method as a post-processing step on several benchmark test functions and different base evolutionary algorithms.
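The Hausdorff approximation quality discussed in the abstract is typically measured by the averaged Hausdorff distance Δp, which combines the generational distance (GDp) and inverted generational distance (IGDp) indicators between an archive and a reference set. A minimal NumPy sketch of evaluating this indicator is shown below; the function name `delta_p` and the array layout are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def delta_p(A, R, p=2.0):
    """Averaged Hausdorff distance Delta_p = max(GD_p, IGD_p).

    A: array of shape (m, k) -- objective vectors of the approximation set.
    R: array of shape (n, k) -- objective vectors of the reference set.
    """
    # Pairwise Euclidean distances between every point of A and every point of R.
    D = np.linalg.norm(A[:, None, :] - R[None, :, :], axis=2)
    # GD_p: average (power-mean) distance from each archive point to its
    # nearest reference point.
    gd = np.mean(D.min(axis=1) ** p) ** (1.0 / p)
    # IGD_p: average distance from each reference point to its nearest
    # archive point.
    igd = np.mean(D.min(axis=0) ** p) ** (1.0 / p)
    return max(gd, igd)
```

For instance, an archive identical to the reference set yields Δp = 0, while a single archive point at unit distance from a single reference point yields Δp = 1. A set-based Newton method, as proposed in the paper, would treat the whole archive as one decision variable and drive this indicator toward a local minimum.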

Authors (6)
  1. Hao Wang
  2. Angel E. Rodriguez-Fernandez
  3. Lourdes Uribe
  4. André Deutz
  5. Oziel Cortés-Piña
  6. Oliver Schütze
