
An open-source solver for finding global solutions to constrained derivative-free optimization problems (2404.18080v1)

Published 28 Apr 2024 in math.OC

Abstract: In this work, we propose a heuristic-based open-source solver for finding global solutions to constrained derivative-free optimization (DFO) problems. Our solver, named Global optimization using Surrogates for Derivative-free Optimization (GSDO), relies on surrogate approximations of the original problem. In the proposed algorithm, an initial feasible point is first generated. This point is then used to generate well-spaced feasible points, from which better radial basis function (RBF) surrogate approximations of the original objective and constraint functions are built. Finally, these surrogates are used to solve the derivative-free global optimization problem. The proposed solver can handle quantifiable and nonquantifiable constraints, as well as relaxable and unrelaxable ones. We compared the performance of the proposed solver with state-of-the-art solvers such as Nonlinear Optimization by Mesh Adaptive Direct Search (NOMAD), differential evolution (DE), and Simplicial Homology Global Optimization (SHGO) on standard test problems. The numerical results demonstrate that the performance of our method is competitive with these solvers.
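
The workflow described in the abstract (sample well-spaced feasible points, fit radial basis function surrogates to the objective and constraints, then globally optimize the cheap surrogate problem) can be illustrated with a short, self-contained sketch. The code below is an assumption-laden illustration, not the authors' GSDO implementation: the toy objective, the constraint, the Latin hypercube sample size, and the choice of SciPy's RBFInterpolator and differential_evolution as the surrogate model and the global subsolver are placeholders chosen for clarity.

```python
# Hypothetical sketch of a surrogate-assisted global DFO pass in the spirit of
# the abstract; it is NOT the authors' GSDO code. Names below are illustrative.
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution, NonlinearConstraint

# Stand-ins for expensive black-box objective and constraint evaluations.
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

def constraint(x):          # feasible when constraint(x) <= 0
    return x[0] ** 2 + x[1] ** 2 - 2.0

bounds = np.array([[-3.0, 3.0], [-3.0, 3.0]])

# 1. Space-filling sample (Latin hypercube) standing in for the
#    "well-spaced feasible points" mentioned in the abstract.
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(40), bounds[:, 0], bounds[:, 1])

# 2. Evaluate the true black-box functions once at the sample points.
f_vals = np.array([objective(x) for x in X])
g_vals = np.array([constraint(x) for x in X])

# 3. Fit RBF surrogates to the objective and the constraint.
f_surrogate = RBFInterpolator(X, f_vals, kernel="thin_plate_spline")
g_surrogate = RBFInterpolator(X, g_vals, kernel="thin_plate_spline")

# 4. Globally optimize the cheap surrogate problem; differential evolution
#    with a nonlinear constraint plays the role of the global subsolver here.
nlc = NonlinearConstraint(lambda x: g_surrogate(x[None, :])[0], -np.inf, 0.0)
result = differential_evolution(
    lambda x: f_surrogate(x[None, :])[0],
    bounds=list(map(tuple, bounds)),
    constraints=(nlc,),
    seed=0,
)
print("surrogate minimizer:", result.x, "surrogate value:", result.fun)
```

A full solver such as GSDO would iterate this loop, adding new true evaluations near promising surrogate minimizers and accounting for unrelaxable or nonquantifiable constraints when choosing evaluation points; the sketch above performs only a single fit-and-optimize pass.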

References (38)
  1. Introduction to derivative-free optimization, volume 8. SIAM, 2009.
  2. Derivative-free optimization methods. Acta Numerica, 28:287–404, 2019. doi: 10.1017/S0962492919000060.
  3. C. Audet and W. Hare. Derivative-Free and Blackbox Optimization. Springer Series in Operations Research and Financial Engineering. Springer International Publishing, 2017. doi: 10.1007/978-3-319-68913-5.
  4. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341–359, 1997.
  5. A simplicial homology algorithm for Lipschitz optimisation. Journal of Global Optimization, 72(2):181–217, 2018.
  6. A taxonomy of constraints in black-box simulation-based optimization. Optimization and Engineering, pages 1–19, 2023.
  7. A linesearch-based derivative-free approach for nonsmooth constrained optimization. SIAM Journal on Optimization, 24(3):959–992, 2014.
  8. Optimization by direct search: New perspectives on some classical and modern methods. SIAM Review, 45(3):385–482, 2003.
  9. Mesh adaptive direct search algorithms for constrained optimization. SIAM Journal on Optimization, 17(1):188–217, 2006. doi: 10.1137/040603371.
  10. Engineering design via surrogate modelling: a practical guide. John Wiley & Sons, 2008.
  11. Statistical surrogate formulations for simulation-based design optimization. Journal of Mechanical Design, 137(2), 2015.
  12. H-M Gutmann. A radial basis function method for global optimization. Journal of Global Optimization, 19(3):201–227, 2001.
  13. Global convergence of radial basis function trust-region algorithms for derivative-free optimization. SIAM Review, 55(2):349–371, 2013.
  14. GOSAC: global optimization with surrogate approximation of constraints. Journal of Global Optimization, 69(1):117–136, 2017.
  15. Jack PC Kleijnen. Kriging metamodeling in simulation: A review. European Journal of Operational Research, 192(3):707–716, 2009.
  16. Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13(4):455–492, 1998.
  17. Response surface methodology: process and product optimization using designed experiments. John Wiley & Sons, 2016.
  18. Jerome H Friedman. Multivariate adaptive regression splines. The Annals of Statistics, pages 1–67, 1991.
  19. SO-MI: A surrogate model algorithm for computationally expensive nonlinear mixed-integer black-box global optimization problems. Computers & Operations Research, 40(5):1383–1400, 2013.
  20. A stochastic radial basis function method for the global optimization of expensive functions. INFORMS Journal on Computing, 19(4):497–509, 2007.
  21. Derivative-free optimization: a review of algorithms and comparison of software implementations. Journal of Global Optimization, 56(3):1247–1293, 2013.
  22. RBFOPT: an open-source library for black-box optimization with costly function evaluations. Mathematical Programming Computation, 10(4):597–629, 2018.
  23. Lipschitzian optimization without the Lipschitz constant. Journal of Optimization Theory and Applications, 79(1):157–181, 1993.
  24. Combining radial basis function surrogates and dynamic coordinate search in high-dimensional expensive black-box optimization. Engineering Optimization, 45(5):529–555, 2013a.
  25. A quasi-multistart framework for global optimization of expensive functions using response surface models. Journal of Global Optimization, 56(4):1719–1753, 2013b.
  26. Surrogate optimization of computationally expensive black-box problems with hidden constraints. INFORMS Journal on Computing, 31(4):689–702, 2019.
  27. SNOBFIT–stable noisy optimization by branch and fit. ACM Transactions on Mathematical Software (TOMS), 35(2):1–25, 2008.
  28. Rommel G Regis. Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points. Engineering Optimization, 46(2):218–243, 2014.
  29. A kriging-based constrained global optimization algorithm for expensive black-box functions with infeasible initial points. Journal of Global Optimization, 67(1-2):343–366, 2017.
  30. Global optimization of costly nonconvex functions using radial basis functions. Optimization and Engineering, 1(4):373–397, 2000.
  31. Rommel G Regis. Stochastic radial basis function algorithms for large-scale optimization involving expensive black-box objective and constraint functions. Computers & Operations Research, 38(5):837–853, 2011.
  32. Michael JD Powell. The theory of radial basis function approximation in 1990. Advances in numerical analysis, 2:105–210, 1992.
  33. Kevin P Murphy. Machine learning: a probabilistic perspective. MIT press, 2012.
  34. Sébastien Le Digabel. Algorithm 909: NOMAD: Nonlinear optimization with the MADS algorithm. ACM Transactions on Mathematical Software, 37(4):1–15, 2011.
  35. Jouni Lampinen. A constraint handling approach for the differential evolution algorithm. In Proceedings of the 2002 Congress on Evolutionary Computation. CEC’02 (Cat. No. 02TH8600), volume 2, pages 1468–1473. IEEE, 2002.
  36. Benchmarking derivative-free optimization algorithms. SIAM Journal on Optimization, 20(1):172–191, 2009.
  37. Plant design and economics for chemical engineers, volume 4. McGraw-Hill New York, 2003.
  38. Nonsmooth optimization through Mesh Adaptive Direct Search and Variable Neighborhood Search. Journal of Global Optimization, 41(2):299–318, 2008. doi: 10.1007/s10898-007-9234-1.
