
Improving Algorithm-Selection and Performance-Prediction via Learning Discriminating Training Samples (2404.05359v1)

Published 8 Apr 2024 in cs.NE and cs.LG

Abstract: The choice of input data used to train algorithm-selection models is recognised as a critical part of model success. Recently, feature-free methods for algorithm-selection that use short trajectories obtained from running a solver as input have shown promise. However, it is unclear to what extent these trajectories reliably discriminate between solvers. We propose a meta approach to generating discriminatory trajectories with respect to a portfolio of solvers. The algorithm-configuration tool irace is used to tune the parameters of a simple Simulated Annealing (SA) algorithm to produce trajectories that maximise the performance metrics of ML models trained on this data. We show that when the trajectories obtained from the tuned SA algorithm are used in ML models for algorithm-selection and performance prediction, we obtain significantly improved performance metrics compared to models trained both on raw trajectory data and on exploratory landscape features.
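The core idea can be sketched in a few lines: run a solver briefly, record its best-so-far fitness trajectory, and feed that trajectory (rather than hand-crafted landscape features) to a model that discriminates between problem classes or solvers. The sketch below is illustrative only, assuming a toy setup: the SA parameters, the two test functions, and the nearest-centroid "selector" are stand-ins for the paper's irace-tuned SA on BBOB functions and its trained ML models.

```python
import math
import random

def sa_trajectory(f, dim, n_steps, temp, cooling, rng):
    """Run simulated annealing on f and return the best-so-far fitness trajectory."""
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = f(x)
    best = fx
    traj = []
    for _ in range(n_steps):
        y = [xi + rng.gauss(0, 0.5) for xi in x]
        fy = f(y)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fy < fx or rng.random() < math.exp(-(fy - fx) / temp):
            x, fx = y, fy
        best = min(best, fx)
        traj.append(best)
        temp *= cooling
    return traj

def sphere(x):
    return sum(xi * xi for xi in x)

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

rng = random.Random(0)
# Collect short trajectories from two problem classes: these are the training samples.
train = {name: [sa_trajectory(f, 5, 30, 2.0, 0.95, rng) for _ in range(20)]
         for name, f in [("sphere", sphere), ("rastrigin", rastrigin)]}
cents = {name: centroid(rows) for name, rows in train.items()}

def predict(traj):
    """Nearest-centroid stand-in for the paper's ML selector/predictor."""
    return min(cents, key=lambda n: sum((a - b) ** 2 for a, b in zip(traj, cents[n])))

print(predict(sa_trajectory(sphere, 5, 30, 2.0, 0.95, rng)))
```

In the paper's actual pipeline, irace searches the SA parameter space (temperature schedule, step size, etc.) so that the resulting trajectories maximise the downstream model's classification or regression metrics; the fixed parameters above are placeholders for that tuned configuration.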
