
CMA-ES for Safe Optimization

Published 17 May 2024 in cs.NE | arXiv:2405.10534v1

Abstract: In several real-world applications in medical and control engineering, evaluating certain solutions involves inherent risk; such solutions are called unsafe. This optimization setting is known as safe optimization and is formulated as a specialized constrained optimization problem with constraints on safety functions. Safe optimization requires efficient optimization without evaluating unsafe solutions. A few studies have proposed methods for safe optimization based on Bayesian optimization and evolutionary algorithms. However, Bayesian optimization-based methods often struggle to reach superior solutions, and the evolutionary algorithm-based method fails to effectively reduce unsafe evaluations. This study focuses on CMA-ES, an efficient evolutionary algorithm, and proposes an optimization method termed safe CMA-ES, designed to achieve both safety and efficiency in safe optimization. Safe CMA-ES estimates the Lipschitz constants of the safety functions, transformed with the distribution parameters, using the maximum norm of the gradient in Gaussian process regression. It then projects each sample onto the nearest point of the safe region constructed from the estimated Lipschitz constants. Numerical simulations on benchmark functions show that safe CMA-ES successfully performs optimization while suppressing unsafe evaluations, whereas existing methods struggle to significantly reduce them.
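The projection step described in the abstract can be sketched as follows. This is an illustrative simplification, not the paper's implementation: it assumes a single safety function, the function and variable names are invented, and the paper performs the projection in the coordinate system transformed by the CMA-ES distribution parameters. The underlying idea is standard: if a safety function s has Lipschitz constant L and s(x_i) was observed to be safe (below a threshold h), then every point within distance (h - s(x_i)) / L of x_i is also guaranteed safe, so unsafe-looking samples can be pulled back onto the boundary of the nearest such ball.

```python
import numpy as np

def project_to_safe_region(x, safe_points, safe_values, lipschitz, threshold):
    """Project a candidate x onto an estimated safe region (sketch).

    The safe region is the union of balls around previously evaluated
    safe points x_i: any x with s(x_i) + L * ||x - x_i|| <= threshold
    is guaranteed safe under the Lipschitz bound L.
    """
    # Radius of the guaranteed-safe ball around each evaluated safe point.
    radii = (threshold - safe_values) / lipschitz
    # Distance from the candidate to each evaluated safe point.
    dists = np.linalg.norm(safe_points - x, axis=1)
    if np.any(dists <= radii):
        return x  # already inside the estimated safe region
    # Otherwise, move to the boundary of the ball whose surface is nearest:
    # shrink the candidate toward that safe point until it lies on the ball.
    i = int(np.argmin(dists - radii))
    direction = (x - safe_points[i]) / dists[i]
    return safe_points[i] + radii[i] * direction
```

In the method described by the abstract, the Lipschitz constant passed in here is not assumed known; it is estimated as the maximum gradient norm of a Gaussian process regression model fitted to the observed safety-function values.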

