Exploratory Landscape Analysis for Mixed-Variable Problems (2402.16467v1)

Published 26 Feb 2024 in cs.NE

Abstract: Exploratory landscape analysis (ELA), and fitness landscape analysis in general, has been pivotal in facilitating problem understanding, algorithm design, and endeavors such as automated algorithm selection and configuration. These techniques have largely been limited to search spaces of a single domain. In this work, we provide the means to compute exploratory landscape features for mixed-variable problems, where the decision space is a mixture of continuous, binary, integer, and categorical variables. This is achieved by utilizing existing encoding techniques originating from machine learning. We provide a comprehensive juxtaposition of the results based on these different techniques. To further highlight their merit for practical applications, we design and conduct an automated algorithm selection study based on a hyperparameter optimization benchmark suite. By clustering the benchmark problems on the computed landscape features, we derive a meaningful compartmentalization that mirrors algorithm behavior: different clusters favor different best-performing algorithms. Finally, our trained algorithm selector closes the gap between the single best and the virtual best solver by 57.5% across all benchmark problems.
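The core idea of the abstract — map a mixed-variable sample into a purely continuous space via an ML-style encoding, then compute standard ELA features on the encoded points — can be illustrated with a minimal sketch. Everything here is assumed for illustration: the toy mixed objective, the one-hot encoding (one of several encoders the paper compares), and a simple dispersion-style feature computed by hand rather than with an ELA library.

```python
import numpy as np

# Hypothetical mixed-variable sample: two continuous variables in
# [-5, 5] and one categorical variable with levels {"a", "b", "c"}.
rng = np.random.default_rng(0)
n = 100
x_cont = rng.uniform(-5.0, 5.0, size=(n, 2))
x_cat = rng.choice(["a", "b", "c"], size=n)

# One-hot encode the categorical column so the whole sample lives in a
# purely continuous space where standard ELA features are defined.
levels = np.array(["a", "b", "c"])
onehot = (x_cat[:, None] == levels[None, :]).astype(float)
X = np.hstack([x_cont, onehot])          # shape (n, 5)

# Toy objective standing in for a real mixed-variable problem:
# a sphere on the continuous part plus a per-category penalty.
penalty = np.array([{"a": 0.0, "b": 1.0, "c": 2.0}[c] for c in x_cat])
y = (x_cont ** 2).sum(axis=1) + penalty

def mean_pairwise(A):
    """Mean Euclidean distance over all unordered point pairs."""
    d = np.sqrt(((A[:, None, :] - A[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(A), k=1)
    return d[iu].mean()

# Dispersion-style feature on the encoded sample: how tightly the best
# 10% of points cluster relative to the whole sample.  Values well
# below 1 suggest a single-funnel structure.
best = X[np.argsort(y)[: n // 10]]
dispersion = mean_pairwise(best) / mean_pairwise(X)
print(f"dispersion feature: {dispersion:.3f}")
```

On this toy problem the best points concentrate near the continuous origin with category "a", so the dispersion ratio comes out well below 1. In practice one would compute the full ELA feature sets on the encoded sample and compare across encoders (one-hot, target encoding, etc.), as the paper does.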
