A Performance Analysis of Basin Hopping Compared to Established Metaheuristics for Global Optimization (2403.05877v1)

Published 9 Mar 2024 in cs.NE and cs.PF

Abstract: Over the last few decades, many metaheuristics for global numerical optimization have been proposed. Among them, Basin Hopping is very simple and straightforward to implement, yet it is rarely used outside its original physical chemistry community. In this work, our aim is to compare Basin Hopping, and two population variants of it, with readily available implementations of the well-known metaheuristics Differential Evolution, Particle Swarm Optimization, and Covariance Matrix Adaptation Evolution Strategy. We perform numerical experiments using the IOHprofiler environment with the BBOB test function set and two difficult real-world problems. The experiments were carried out in two different but complementary ways: by measuring performance under a fixed budget of function evaluations and by considering a fixed target value. The general conclusion is that Basin Hopping and its newly introduced population variant are almost as good as Covariance Matrix Adaptation Evolution Strategy on the synthetic benchmark functions and better than it on the two hard cluster energy minimization problems. Thus, the proposed analyses show that Basin Hopping can be considered a good candidate for global numerical optimization problems, along with the more established metaheuristics, especially if one wants quick and reliable results on an unknown problem.
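Basin Hopping amounts to a simple loop: perturb the current point, run a local minimizer from the perturbed point, and accept or reject the resulting local minimum. As a minimal illustrative sketch (not the authors' experimental pipeline, which runs under IOHprofiler on the BBOB suite), the code below uses SciPy's readily available scipy.optimize.basinhopping on the Rastrigin function; the test function, dimension, and hyperparameter values here are assumptions chosen for demonstration only.

```python
# Illustrative Basin Hopping run with SciPy's off-the-shelf implementation.
# NOTE: this is a sketch, not the paper's experimental setup; the study
# benchmarks under IOHprofiler/BBOB, while Rastrigin serves here as a
# stand-in multimodal test problem.
import numpy as np
from scipy.optimize import basinhopping

def rastrigin(x):
    # Highly multimodal benchmark; global minimum f(0) = 0.
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

dim = 10
rng = np.random.default_rng(0)
x0 = rng.uniform(-5.12, 5.12, size=dim)  # random starting point

# Basin Hopping alternates a random perturbation ("hop") of the current
# point with a local minimization (here L-BFGS-B), accepting or rejecting
# each new local minimum via a Metropolis-style criterion.
result = basinhopping(
    rastrigin,
    x0,
    niter=200,                               # number of hops
    stepsize=0.5,                            # perturbation magnitude
    minimizer_kwargs={"method": "L-BFGS-B"},
)

print(f"best objective: {result.fun:.6f}")
print(f"best point:     {np.round(result.x, 3)}")
```

In a fixed-budget comparison like the paper's, one would cap the total number of function evaluations and record the best value each algorithm reaches; in a fixed-target comparison, one would instead record how many evaluations each algorithm needs to reach a given target value.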
