cmaes : A Simple yet Practical Python Library for CMA-ES (2402.01373v2)

Published 2 Feb 2024 in cs.NE and cs.MS

Abstract: The covariance matrix adaptation evolution strategy (CMA-ES) has been highly effective in black-box continuous optimization, as demonstrated by its success in both benchmark problems and various real-world applications. To address the need for an accessible yet potent tool in this domain, we developed cmaes, a simple and practical Python library for CMA-ES. cmaes is characterized by its simplicity, offering intuitive use and high code readability. This makes it suitable for quickly using CMA-ES, as well as for educational purposes and seamless integration into other libraries. Despite its simplistic design, cmaes maintains enhanced functionality. It incorporates recent advancements in CMA-ES, such as learning rate adaptation for challenging scenarios, transfer learning, and mixed-integer optimization capabilities. These advanced features are accessible through a user-friendly API, ensuring that cmaes can be easily adopted in practical applications. We regard cmaes as the first choice for a Python CMA-ES library among practitioners. The software is available under the MIT license at https://github.com/CyberAgentAILab/cmaes.

Summary

  • The paper presents cmaes as a minimal yet practical tool for implementing CMA-ES, offering adaptive learning rates and a user-friendly API to simplify complex optimization tasks.
  • The paper demonstrates recent methods such as LRA-CMA for noisy and multimodal problems, WS-CMA for transfer learning from related tasks, and CMAwM for mixed-integer optimization.
  • The paper highlights rigorous software practices, including continuous integration, benchmarking, and fuzz testing, and demonstrates seamless integration with frameworks such as Optuna.

Overview of the cmaes Python Library for CMA-ES

This paper introduces "cmaes," a Python library that provides a simple, practical implementation of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), a method widely regarded for its effectiveness in black-box continuous optimization. Designed to balance simplicity and practicality, cmaes offers an accessible yet potent tool for researchers and practitioners alike.

Key Features and Design Philosophy

The core appeal of cmaes lies in its minimalist design coupled with high functionality. Prioritizing simplicity, the library offers highly readable code, making it well suited to educational use and to quick integration into other computational frameworks. This simplicity extends to its API, which enables intuitive use without compromising functionality, as the short sketch below illustrates.
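
The ask-and-tell interface makes this concrete. The following minimal sketch follows the quickstart pattern from the project's README; the constructor arguments `mean` and `sigma` and the `ask`/`tell` pair are the documented entry points, while the objective function here is an illustrative stand-in.

```python
import numpy as np
from cmaes import CMA

def sphere(x: np.ndarray) -> float:
    # A simple convex test function; any black-box objective works here.
    return float(np.sum(x ** 2))

# Initial search distribution: mean vector and global step size sigma.
optimizer = CMA(mean=np.zeros(2), sigma=1.3)

for generation in range(50):
    solutions = []
    for _ in range(optimizer.population_size):
        x = optimizer.ask()               # sample one candidate solution
        solutions.append((x, sphere(x)))  # evaluate it
    optimizer.tell(solutions)             # update mean, step size, covariance
```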

In terms of practicality, cmaes incorporates recent advances in CMA-ES, such as learning rate adaptation for challenging optimization scenarios, transfer learning capabilities, and mixed-integer optimization support. These features are exposed through the same straightforward API, preserving ease of use while extending the library's utility in real-world applications.

Notable Advancements and Methods

  1. LRA-CMA:
    • This method adds automatic learning rate adaptation to CMA-ES, targeting multimodal and noisy problems in particular. By adjusting the learning rate to maintain a constant signal-to-noise ratio, it handles such problems without extensive hyperparameter tuning (see the first sketch after this list).
  2. WS-CMA:
    • WS-CMA offers a transfer learning approach that uses solutions from previously solved, similar problems to initialize the search distribution for a new task. This yields significant acceleration in settings such as hyperparameter optimization, where evaluations are expensive (also shown in the first sketch after this list).
  3. CMAwM:
    • To handle mixed-integer optimization problems, CMAwM introduces a margin correction to the distribution parameters. The margin keeps binary and integer coordinates from stagnating during the search while preserving CMA-ES's strengths on continuous variables (see the second sketch after this list).
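
To make the first two items concrete, the first sketch below combines the warm-start helper with learning rate adaptation, following usage shown in the library's README. The `get_warm_start_mgd` helper and the `lr_adapt` flag are documented entry points; the toy source and target tasks are illustrative stand-ins, and exact signatures should be checked against the current release.

```python
import numpy as np
from cmaes import CMA, get_warm_start_mgd

def source_task(x: np.ndarray) -> float:
    return float(np.sum((x - 1.0) ** 2))

def target_task(x: np.ndarray) -> float:
    # A task similar to source_task, shifted slightly.
    return float(np.sum((x - 1.2) ** 2))

# 1. Collect (solution, value) pairs from the already-solved source task.
rng = np.random.default_rng(0)
source_solutions = [(x, source_task(x)) for x in rng.standard_normal((100, 2))]

# 2. Estimate a promising initial distribution from those pairs (WS-CMA).
ws_mean, ws_sigma, ws_cov = get_warm_start_mgd(source_solutions, gamma=0.1, alpha=0.1)

# 3. Optimize the target task with learning rate adaptation (LRA-CMA)
#    enabled for robustness on multimodal or noisy objectives.
optimizer = CMA(mean=ws_mean, sigma=ws_sigma, cov=ws_cov, lr_adapt=True)

for generation in range(50):
    solutions = []
    for _ in range(optimizer.population_size):
        x = optimizer.ask()
        solutions.append((x, target_task(x)))
    optimizer.tell(solutions)
```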

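The second sketch covers item 3. In the documented CMAwM interface, each dimension gets a box constraint and a discretization step (0 for continuous coordinates), and `ask()` returns both a margin-corrected point for evaluation and a raw sample to pass back to `tell()`; again, treat the details as subject to the current documentation, and note that the objective here is an illustrative stand-in.

```python
import numpy as np
from cmaes import CMAwM

def mixed_objective(x: np.ndarray) -> float:
    # x[0] is continuous; x[1] is restricted to integers via `steps`.
    return (x[0] - 0.5) ** 2 + (x[1] - 3.0) ** 2

bounds = np.array([[-5.0, 5.0], [-5.0, 5.0]])  # box constraint per dimension
steps = np.array([0.0, 1.0])                   # 0 = continuous, 1 = integer grid

optimizer = CMAwM(mean=np.zeros(2), sigma=2.0, bounds=bounds, steps=steps)

for generation in range(30):
    solutions = []
    for _ in range(optimizer.population_size):
        # ask() returns the discretized point to evaluate and the raw
        # sample that the margin-corrected update expects back.
        x_for_eval, x_for_tell = optimizer.ask()
        solutions.append((x_for_tell, mixed_objective(x_for_eval)))
    optimizer.tell(solutions)
```
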
Software Quality and Integration

To keep the software robust, cmaes relies on continuous integration, quick benchmarking, and fuzz testing to catch regressions and edge-case errors early. These measures help ensure that the library remains reliable and efficient over time.
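
As a flavor of what such testing can look like, an illustrative property-based test (a sketch in the style of Hypothesis, not the project's actual test suite) might feed random dimensions and seeds through one ask-and-tell cycle and assert basic invariants:

```python
import numpy as np
from hypothesis import given, settings
from hypothesis import strategies as st
from cmaes import CMA

@given(
    dim=st.integers(min_value=2, max_value=20),
    seed=st.integers(min_value=0, max_value=2**31 - 1),
)
@settings(deadline=None)
def test_ask_tell_roundtrip(dim: int, seed: int) -> None:
    optimizer = CMA(mean=np.zeros(dim), sigma=1.0, seed=seed)
    solutions = []
    for _ in range(optimizer.population_size):
        x = optimizer.ask()
        # Invariants: correct shape and finite values for every sample.
        assert x.shape == (dim,)
        assert np.all(np.isfinite(x))
        solutions.append((x, float(np.sum(x ** 2))))
    optimizer.tell(solutions)  # must accept exactly population_size results
```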

The library's ability to integrate seamlessly with other systems is illustrated by its use in Optuna, a prominent hyperparameter optimization framework. Its custom state management keeps serialized optimizers small, which is especially advantageous for applications that save and load optimizer state frequently.
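
Because the optimizer state is held in a handful of NumPy arrays, it can be serialized with standard tools. The following sketch assumes the optimizer object is picklable, as its use inside Optuna suggests:

```python
import pickle
import numpy as np
from cmaes import CMA

optimizer = CMA(mean=np.zeros(2), sigma=1.3)

# ... run some generations, then persist the optimizer state ...
blob = pickle.dumps(optimizer)  # compact because the state is small

# Later (possibly in another process), restore and continue optimizing.
restored = pickle.loads(blob)
solutions = []
for _ in range(restored.population_size):
    x = restored.ask()
    solutions.append((x, float(np.sum(x ** 2))))
restored.tell(solutions)
```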

Implications and Future Directions

The design philosophy of cmaes, balancing simplicity with up-to-date functionality, positions it as a primary resource for CMA-ES practitioners. While the library does not match the exhaustive feature set of larger packages such as pycma, its focus on essential features and user-friendly design makes it a valuable addition to the optimization toolkit.

Future enhancements could explore extending its capabilities to more specialized domains, adapting emerging techniques in optimization, and fostering community-driven development. Such developments would further solidify cmaes's role in bridging academic research and applied optimization strategies.

In conclusion, cmaes provides a streamlined, practical approach to implementing CMA-ES in Python, making it an attractive tool for those engaged in both research and applied optimization projects.
