
Constrained Efficient Global Optimization of Expensive Black-box Functions (2211.00162v3)

Published 31 Oct 2022 in math.OC

Abstract: We study the problem of constrained efficient global optimization, where both the objective and constraints are expensive black-box functions that can be learned with Gaussian processes. We propose CONFIG (CONstrained efFIcient Global Optimization), a simple and effective algorithm to solve it. Under certain regularity assumptions, we show that our algorithm enjoys the same cumulative regret bound as that in the unconstrained case and similar cumulative constraint violation upper bounds. For commonly used Matern and Squared Exponential kernels, our bounds are sublinear and allow us to derive a convergence rate to the optimal solution of the original constrained problem. In addition, our method naturally provides a scheme to declare infeasibility when the original black-box optimization problem is infeasible. Numerical experiments on sampled instances from the Gaussian process, artificial numerical problems, and a black-box building controller tuning problem all demonstrate the competitive performance of our algorithm. Compared to the other state-of-the-art methods, our algorithm significantly improves the theoretical guarantees, while achieving competitive empirical performance.

Authors (4)
  1. Wenjie Xu (29 papers)
  2. Yuning Jiang (106 papers)
  3. Bratislav Svetozarevic (16 papers)
  4. Colin N. Jones (88 papers)
Citations (2,044)

Summary

Constrained Efficient Global Optimization of Expensive Black-box Functions

The paper presents CONFIG (CONstrained efFIcient Global Optimization), an algorithm for constrained global optimization problems in which both the objective and the constraints are expensive black-box functions modeled with Gaussian processes (GPs). The work sits within the literature on sample-efficient algorithms, which are crucial when each function evaluation is resource-intensive, as in hyperparameter tuning, control system optimization, and related fields.

Problem Context

The problem addressed is the global optimization of black-box functions that are costly to evaluate, subject to constraints that are similarly expensive. Traditional optimization methods, which often require numerous samples, are impractical when each evaluation carries a high resource cost. Gaussian-process-based optimization methods therefore offer a promising alternative due to their sample efficiency.

Algorithm Proposal: CONFIG

CONFIG, the proposed algorithm, stands out in the constrained optimization landscape by:

  • Following the principle of optimism in the face of uncertainty to select sample points.
  • Solving, at each iteration, an auxiliary constrained optimization problem in which both the objective and the constraints are replaced by their lower confidence bound (LCB) surrogates, and evaluating the resulting minimizer.
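The bullet points above can be sketched on a finite candidate grid. In the sketch below, the GP posterior means and standard deviations (`mu_f`, `sig_f`, `mu_g`, `sig_g`) and the confidence scaling `beta` are illustrative assumptions, not the paper's exact quantities:

```python
import numpy as np

def lcb(mu, sigma, beta):
    """Lower confidence bound: an optimistic estimate of an unknown function."""
    return mu - np.sqrt(beta) * sigma

def config_step(mu_f, sig_f, mu_g, sig_g, beta=4.0):
    """One illustrative CONFIG-style iteration over a finite candidate set.

    mu_f, sig_f: GP posterior mean/std of the objective at each candidate.
    mu_g, sig_g: GP posterior mean/std of a constraint g (g(x) <= 0 feasible).
    Returns the index of the next candidate to evaluate, or None when no
    candidate is even optimistically feasible.
    """
    f_lcb = lcb(mu_f, sig_f, beta)
    g_lcb = lcb(mu_g, sig_g, beta)
    feasible = g_lcb <= 0.0              # optimistically feasible candidates
    if not feasible.any():
        return None                      # grounds for declaring infeasibility
    idx = np.flatnonzero(feasible)
    return int(idx[np.argmin(f_lcb[idx])])
```

In a full loop, the selected point would be evaluated on the true objective and constraints, the GP posteriors updated, and the step repeated.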

The primary theoretical contribution is a proof that CONFIG enjoys sublinear cumulative regret bounds analogous to those available in the unconstrained setting. Specifically, the regret bound scales with the maximum information gain for the Matérn and Squared Exponential kernels, and similar sublinear bounds hold for the cumulative constraint violation; together these yield a convergence rate to the optimal solution of the original constrained problem.
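In standard notation (a sketch of the usual definitions in this literature, not the paper's exact statement), the two quantities being bounded, for minimization of f subject to g(x) ≤ 0 with constrained optimum x*, are:

```latex
% Cumulative regret and cumulative constraint violation over T evaluations:
R_T = \sum_{t=1}^{T} \bigl( f(x_t) - f(x^*) \bigr), \qquad
V_T = \sum_{t=1}^{T} \max\{\, g(x_t),\, 0 \,\}.
```

Sublinearity means R_T and V_T grow slower than T, so the average regret and average violation vanish as T grows, which is what allows a convergence rate to be derived.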

Numerical Results

Numerical experiments substantiate CONFIG's effectiveness, showing competitive performance against prevalent methods such as Constrained Expected Improvement (CEI) and other state-of-the-art techniques. CONFIG also embeds a mechanism to declare infeasibility when the original optimization problem is infeasible, a capability rarely addressed in the existing literature.
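The infeasibility declaration can be illustrated with the same lower-confidence-bound idea: if even the optimistic constraint estimate is positive at every candidate, no feasible point is plausible under the GP model. The helper below is a hypothetical sketch of this scheme, not the paper's exact rule:

```python
import numpy as np

def declare_infeasible(mu_g, sig_g, beta=4.0):
    """Return True when the optimistic (lower-confidence-bound) constraint
    estimate is strictly positive everywhere, i.e. even the most favorable
    plausible constraint value violates g(x) <= 0 at every candidate."""
    return bool(np.all(mu_g - np.sqrt(beta) * sig_g > 0.0))
```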

Implications and Future Work

The practical implications of CONFIG are significant, particularly in applications where computational resources are limited, or the cost of failures is prohibitive. The inclusion of rigorous theoretical guarantees on cumulative regrets and violations presents a strong case for CONFIG's integration into real-world optimization processes.

Future research directions could explore:

  • Enhancing CONFIG's robustness when hyperparameters are misspecified.
  • Extending the method to tackle problems with larger input dimensions through adaptive, scalable strategies.
  • Investigating the efficiency of CONFIG in highly dynamic or time-varying environments.

Conclusion

CONFIG marks a substantial advance in constrained optimization of expensive black-box functions. By combining an intuitive framework with rigorous theoretical guarantees and empirical validation, the algorithm provides a valuable tool for researchers and practitioners facing complex constrained optimization problems across a range of fields.