Evolutionary Alternating Direction Method of Multipliers for Constrained Multi-Objective Optimization with Unknown Constraints (2401.00978v1)

Published 2 Jan 2024 in cs.NE

Abstract: Constrained multi-objective optimization problems (CMOPs) pervade real-world applications in science, engineering, and design. Constraint violation has been a building block in the design of evolutionary multi-objective optimization algorithms for solving CMOPs. However, in certain scenarios the constraint functions might be unknown or inadequately defined, making constraint violation unattainable and potentially misleading for conventional constrained evolutionary multi-objective optimization algorithms. To address this issue, we present a first-of-its-kind evolutionary optimization framework, inspired by the principles of the alternating direction method of multipliers, that decouples objective and constraint functions. This framework tackles CMOPs with unknown constraints by reformulating the original problem into an additive form of two subproblems, each allotted a dedicated evolutionary population. Notably, the two populations evolve towards complementary directions during their optimization processes, and their evolutionary directions alternate in order to minimize the discrepancy between them, aiding the discovery of feasible solutions. Comparative experiments against five state-of-the-art constrained evolutionary multi-objective optimization algorithms, on 120 benchmark test problem instances with varying properties as well as two real-world engineering optimization problems, demonstrate the effectiveness and superiority of our proposed framework. Its salient features include faster convergence and enhanced resilience to various Pareto front shapes.
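For context, the additive two-subproblem reformulation mentioned in the abstract follows the spirit of the classical two-block ADMM splitting. A minimal textbook sketch is given below; the symbols f, g, A, B, c, \rho and y are the generic ADMM notation rather than the paper's own, and the paper's evolutionary variant replaces the analytical arg-min updates with population-based search:

\begin{aligned}
&\min_{x,\,z}\; f(x) + g(z) \quad \text{s.t.}\quad Ax + Bz = c,\\
&L_\rho(x, z, y) = f(x) + g(z) + y^\top (Ax + Bz - c) + \tfrac{\rho}{2}\,\lVert Ax + Bz - c \rVert_2^2,\\
&x^{k+1} = \operatorname*{arg\,min}_{x}\; L_\rho\bigl(x, z^{k}, y^{k}\bigr),\qquad
z^{k+1} = \operatorname*{arg\,min}_{z}\; L_\rho\bigl(x^{k+1}, z, y^{k}\bigr),\\
&y^{k+1} = y^{k} + \rho\,\bigl(Ax^{k+1} + Bz^{k+1} - c\bigr).
\end{aligned}

Read against the abstract, the two arg-min steps correspond loosely to the two evolutionary populations (one driven chiefly by the objectives, the other by a constraint-related subproblem), and the alternation of their evolutionary directions plays the role of shrinking the residual Ax + Bz - c between the two blocks.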

Authors (4)
  1. Shuang Li (203 papers)
  2. Ke Li (723 papers)
  3. Wei Li (1122 papers)
  4. Ming Yang (289 papers)
Citations (2)
