Knowing The What But Not The Where in Bayesian Optimization (1905.02685v5)

Published 7 May 2019 in stat.ML and cs.LG

Abstract: Bayesian optimization (BO) has demonstrated impressive success in finding the optimum input x* and output f* = f(x*) = max f(x) of a black-box function f. In some applications, however, the optimum output f* is known in advance, and the goal is to find the corresponding optimum input x*. In this paper, we consider a new BO setting in which knowledge of the optimum output f* is available. Our goal is to exploit this knowledge to search for the input x* efficiently. To achieve this, we first transform the Gaussian process surrogate using the information about the optimum output. We then propose two acquisition functions, called confidence bound minimization and expected regret minimization. We show that our approaches work intuitively and give quantitatively better performance than standard BO methods. We demonstrate real applications in tuning a deep reinforcement learning algorithm on the CartPole problem and XGBoost on the Skin Segmentation dataset, for which the optimum values are publicly available.
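The two acquisition functions named in the abstract lend themselves to a short sketch. Below is a minimal, hypothetical Python illustration of how an expected-regret-minimization-style criterion could be scored on a candidate grid, assuming a plain Gaussian posterior f(x) ~ N(mu(x), sigma(x)^2) rather than the paper's transformed surrogate. The closed form used for E[max(f* - f(x), 0)] is the standard mirror image of expected improvement; the `confidence_gap` helper and the `beta` value are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np
from scipy.stats import norm

def expected_regret(mu, sigma, f_star):
    """Expected regret E[max(f* - f(x), 0)] under a Gaussian posterior
    f(x) ~ N(mu, sigma^2). This is the standard closed form (the mirror
    image of expected improvement); minimizing it favours points whose
    posterior concentrates near the known optimum value f*."""
    sigma = np.maximum(sigma, 1e-12)          # guard against zero variance
    z = (f_star - mu) / sigma
    return (f_star - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def confidence_gap(mu, sigma, f_star, beta=2.0):
    """Gap between the upper confidence bound mu + beta*sigma and f*.
    A confidence-bound-minimization-style rule would pick the candidate
    with the smallest gap (hypothetical form; beta is illustrative)."""
    return np.abs(mu + beta * sigma - f_star)

# Illustrative selection step: given GP posterior means/stds on a candidate
# grid, pick the next query point by minimizing the expected regret.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_cand = np.linspace(0.0, 1.0, 201)            # candidate inputs
    mu = np.sin(3.0 * X_cand)                      # stand-in posterior mean
    sigma = 0.1 + 0.2 * rng.random(X_cand.shape)   # stand-in posterior std
    f_star = 1.0                                   # known optimum output

    er = expected_regret(mu, sigma, f_star)
    x_next = X_cand[np.argmin(er)]
    print(f"next query (ERM-style): x = {x_next:.3f}")
```

In practice, mu and sigma would come from a GP fitted to the observations collected so far, and the selected x_next would be evaluated on the black-box function before refitting; the grid, the stand-in posterior, and the selection rule above are placeholders for that loop.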

Citations (33)
