
Deep-ELA: Deep Exploratory Landscape Analysis with Self-Supervised Pretrained Transformers for Single- and Multi-Objective Continuous Optimization Problems (2401.01192v2)

Published 2 Jan 2024 in cs.LG

Abstract: Many recent works have demonstrated the potential of Exploratory Landscape Analysis (ELA) features to numerically characterize, in particular, single-objective continuous optimization problems. These numerical features provide the input for all kinds of machine learning tasks on continuous optimization problems, ranging from high-level property prediction to automated algorithm selection and automated algorithm configuration. Without ELA features, the means for analyzing and understanding the characteristics of single-objective continuous optimization problems are -- to the best of our knowledge -- very limited. Yet, despite their usefulness, as demonstrated in several past works, ELA features suffer from several drawbacks. These include, in particular, (1) strong correlations between multiple features and (2) their very limited applicability to multi-objective continuous optimization problems. As a remedy, recent works proposed deep learning-based approaches as alternatives to ELA. In these works, e.g., point-cloud transformers were used to characterize an optimization problem's fitness landscape. However, these approaches require a large amount of labeled training data. Within this work, we propose a hybrid approach, Deep-ELA, which combines the benefits of deep learning and ELA features. Specifically, we pre-trained four transformers on millions of randomly generated optimization problems to learn deep representations of the landscapes of continuous single- and multi-objective optimization problems. Our proposed framework can either be used out of the box for analyzing single- and multi-objective continuous optimization problems, or subsequently be fine-tuned to various tasks focusing on algorithm behavior and problem understanding.
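To make concrete what "numerically characterizing" a fitness landscape means in the ELA setting the abstract describes, the sketch below samples a simple test function and computes two classic ELA-style features: fitness-distance correlation and dispersion. This is a minimal illustration under assumed choices (uniform sampling, the sphere function, these two particular features); it is not Deep-ELA's actual pipeline, which instead feeds sampled landscapes to pretrained transformers.

```python
import numpy as np

def sample_landscape(f, dim, n, seed=0):
    # Draw a uniform random design in [-5, 5]^dim and evaluate f on it.
    # (Illustrative choice; real ELA studies often use Latin hypercube
    # or Sobol' designs instead.)
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, size=(n, dim))
    y = np.array([f(x) for x in X])
    return X, y

def fitness_distance_correlation(X, y):
    # Correlation between each sample's fitness and its distance to the
    # best sampled point -- a classic landscape feature. Values near 1
    # indicate a globally convex, "easy" structure for minimization.
    best = X[np.argmin(y)]
    d = np.linalg.norm(X - best, axis=1)
    return float(np.corrcoef(d, y)[0, 1])

def dispersion(X, y, frac=0.1):
    # Mean pairwise distance among the best `frac` of the samples,
    # normalized by the mean pairwise distance over all samples.
    # Small values suggest the good points cluster in one region.
    def mean_pairwise(P):
        diffs = P[:, None, :] - P[None, :, :]
        return np.linalg.norm(diffs, axis=-1).mean()
    k = max(2, int(frac * len(y)))
    top = X[np.argsort(y)[:k]]
    return float(mean_pairwise(top) / mean_pairwise(X))

sphere = lambda x: float(np.dot(x, x))  # unimodal toy problem
X, y = sample_landscape(sphere, dim=2, n=500)
fdc = fitness_distance_correlation(X, y)
disp = dispersion(X, y)
# On the unimodal sphere, FDC is strongly positive and dispersion is
# well below 1, i.e. the best samples are concentrated near the optimum.
```

Hand-crafted features like these are exactly what Deep-ELA's learned representations are meant to complement: the deep features are trained without labels and avoid the strong inter-feature correlations noted in the abstract.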
