Using the Empirical Attainment Function for Analyzing Single-objective Black-box Optimization Algorithms (2404.02031v3)
Abstract: A widely accepted way to assess the performance of iterative black-box optimizers is to analyze their empirical cumulative distribution function (ECDF) of pre-defined quality targets achieved no later than a given runtime. In this work, we consider an alternative approach based on the empirical attainment function (EAF), and we show that the target-based ECDF is an approximation of the EAF. We argue that the EAF has several advantages over the target-based ECDF. In particular, it does not require defining a priori quality targets per function, it captures performance differences more precisely, and it enables additional summary statistics that enrich the analysis. We also show that the average area over the convergence curves is a simpler-to-calculate, yet equivalent, measure of anytime performance. To make the EAF more accessible, we integrate a module for computing it into the IOHanalyzer platform. Finally, we illustrate the use of the EAF via synthetic examples and via the data available for the BBOB suite.
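The quantities mentioned in the abstract can be sketched numerically. The following minimal illustration (all data, variable names, and function names here are ours for illustration, not taken from the paper or from IOHanalyzer) computes, for a handful of synthetic best-so-far convergence curves, the EAF at a (runtime, quality) point, the target-based ECDF for a fixed target grid, and a simple average-area anytime score:

```python
# Hypothetical data: each run is a best-so-far convergence curve,
# one quality value (lower is better) per function evaluation.
runs = [
    [10.0, 8.0, 5.0, 5.0, 2.0],
    [10.0, 9.0, 9.0, 4.0, 1.0],
    [10.0, 7.0, 6.0, 3.0, 3.0],
]

def eaf(runs, t, q):
    """Empirical attainment function: fraction of runs that reach
    quality <= q within the first t evaluations (t is 1-based)."""
    return sum(min(run[:t]) <= q for run in runs) / len(runs)

def ecdf(runs, t, targets):
    """Target-based ECDF: fraction of (run, target) pairs in which the
    run hits the target within the first t evaluations."""
    hits = sum(min(run[:t]) <= tau for run in runs for tau in targets)
    return hits / (len(runs) * len(targets))

def avg_area(runs):
    """Average area under the best-so-far curves (lower is better):
    a simple scalar summary of anytime performance."""
    return sum(sum(run) for run in runs) / (len(runs) * len(runs[0]))

# With a dense target grid, the ECDF at time t averages the EAF over
# the quality axis -- the approximation relationship noted above.
targets = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(eaf(runs, 5, 3.0))         # → 1.0 (all runs reach quality <= 3)
print(ecdf(runs, 5, targets))    # → 0.9
print(avg_area(runs))
```

The discretization effect is visible here: the ECDF value depends on the chosen target grid, whereas the EAF is defined directly from the runs without any a priori targets.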