
Efficiently Tackling Million-Dimensional Multiobjective Problems: A Direction Sampling and Fine-Tuning Approach (2304.04067v2)

Published 8 Apr 2023 in cs.NE and cs.AI

Abstract: We define very large-scale multiobjective optimization problems (VLSMOPs) as multiobjective optimization problems with more than 100,000 decision variables. These problems hold substantial significance, given the ubiquity of real-world scenarios that require optimizing hundreds of thousands, if not millions, of variables. However, the higher dimensionality of VLSMOPs intensifies the curse of dimensionality and poses significant challenges for existing large-scale evolutionary multiobjective algorithms, making these problems difficult to solve within the constraints of practical computing resources. To overcome this issue, we propose a novel approach called the very large-scale multiobjective optimization framework (VMOF). The method efficiently samples general yet suitable evolutionary directions in the very large-scale search space and subsequently fine-tunes these directions to locate Pareto-optimal solutions. To sample the most suitable evolutionary directions for different solutions, Thompson sampling is adopted for its effectiveness in recommending from a very large number of items given limited historical evaluations. Furthermore, a technique is designed for fine-tuning directions specific to tracking Pareto-optimal solutions. To aid understanding of the designed framework, we present an analysis of it and then evaluate VMOF on widely recognized benchmarks and real-world problems spanning dimensions from 100 to 1,000,000. Experimental results demonstrate that our method exhibits superior performance not only on LSMOPs but also on VLSMOPs when compared to existing algorithms.
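The abstract's core idea of using Thompson sampling to choose among many candidate evolutionary directions can be illustrated with a minimal Beta-Bernoulli bandit sketch. This is an assumption-laden toy, not the paper's actual VMOF implementation: the reward model (1 if a sampled direction improves a solution, 0 otherwise) and the treatment of each direction as an independent arm are illustrative choices.

```python
import random

class ThompsonSampler:
    """Hedged sketch: Beta-Bernoulli Thompson sampling over a set of
    candidate evolutionary directions (each direction = one "arm").
    Reward semantics are assumed, not taken from the paper."""

    def __init__(self, n_arms):
        # Each arm keeps a Beta(successes + 1, failures + 1) posterior.
        self.successes = [0] * n_arms
        self.failures = [0] * n_arms

    def select(self):
        # Sample once from every arm's posterior and pick the best draw;
        # this naturally balances exploration and exploitation.
        draws = [random.betavariate(s + 1, f + 1)
                 for s, f in zip(self.successes, self.failures)]
        return max(range(len(draws)), key=draws.__getitem__)

    def update(self, arm, improved):
        # improved = True if the direction yielded a better solution
        # (e.g. a dominating offspring); update that arm's posterior.
        if improved:
            self.successes[arm] += 1
        else:
            self.failures[arm] += 1
```

With limited evaluations, the sampler concentrates pulls on directions whose posteriors indicate frequent improvement, which is the property the abstract appeals to when recommending from a very large item set.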

