Quantifying Individual and Joint Module Impact in Modular Optimization Frameworks (2405.11964v1)

Published 20 May 2024 in cs.NE

Abstract: This study explores the influence of modules on the performance of modular optimization frameworks for continuous single-objective black-box optimization. There is an extensive variety of modules to choose from when designing algorithm variants; however, there is rather limited understanding of how each module individually influences algorithm performance and how the modules interact when combined. We use the functional ANOVA (f-ANOVA) framework to quantify the influence of individual modules and module combinations for two algorithms, the modular Covariance Matrix Adaptation (modCMA) and the modular Differential Evolution (modDE). We analyze performance data from 324 modCMA and 576 modDE variants on the BBOB benchmark collection, for two problem dimensions and three computational budgets. Noteworthy findings include the identification of modules that strongly influence the performance of modCMA, such as the weights option and mirrored modules for low-dimensional problems, and the base sampler for high-dimensional problems. The large individual influence of the lpsr module makes it very important for the performance of modDE, regardless of problem dimensionality and computational budget. Comparing the two frameworks, as problem dimensionality and computational budget increase, modDE shifts from individual modules being more influential to module combinations being more influential, while modCMA follows the opposite pattern.
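The core idea of f-ANOVA is to decompose the variance of a performance measure over a configuration space into contributions from individual options (main effects) and from their combinations (interactions). As a minimal illustration of that decomposition, the sketch below computes the exact variance shares for a toy full-factorial table over two hypothetical binary modules; the paper itself fits a random-forest surrogate over far larger module spaces, which this sketch does not attempt.

```python
import numpy as np

def fanova_decomposition(table):
    """Exact f-ANOVA variance decomposition of a 2-factor full-factorial table.

    table[i, j] = performance with module A at level i and module B at level j.
    Returns the fraction of total variance explained by each main effect
    and by the A x B interaction (the fractions sum to 1).
    """
    grand = table.mean()
    a_effect = table.mean(axis=1) - grand            # main effect of module A
    b_effect = table.mean(axis=0) - grand            # main effect of module B
    # Interaction = whatever the additive (grand + main effects) model misses.
    interaction = table - grand - a_effect[:, None] - b_effect[None, :]

    total_var = ((table - grand) ** 2).mean()
    return {
        "A": (a_effect ** 2).mean() / total_var,
        "B": (b_effect ** 2).mean() / total_var,
        "AxB": (interaction ** 2).mean() / total_var,
    }

# Toy example (made-up numbers): module A dominates, module B barely matters,
# and the effects are purely additive, so the interaction share is zero.
perf = np.array([[1.0, 1.1],
                 [3.0, 3.1]])
shares = fanova_decomposition(perf)
```

On this table the A main effect accounts for nearly all of the variance, which is the kind of signal the paper reports for modules such as lpsr in modDE; a nonzero "AxB" share would indicate that the two modules only matter in combination.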
