The reparameterization trick for acquisition functions (1712.00424v1)

Published 1 Dec 2017 in stat.ML, cs.LG, and math.OC

Abstract: Bayesian optimization is a sample-efficient approach to solving global optimization problems. Along with a surrogate model, this approach relies on theoretically motivated value heuristics (acquisition functions) to guide the search process. Maximizing acquisition functions yields the best performance; unfortunately, this ideal is difficult to achieve since optimizing acquisition functions per se is frequently non-trivial. This statement is especially true in the parallel setting, where acquisition functions are routinely non-convex, high-dimensional, and intractable. Here, we demonstrate how many popular acquisition functions can be formulated as Gaussian integrals amenable to the reparameterization trick and, ensuingly, gradient-based optimization. Further, we use this reparameterized representation to derive an efficient Monte Carlo estimator for the upper confidence bound acquisition function in the context of parallel selection.
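
To make the abstract's idea concrete, here is a minimal sketch of reparameterized Monte Carlo acquisition functions for a batch of q query points: posterior samples are written as y = mu + L z with fixed base draws z ~ N(0, I) and L the Cholesky factor of the posterior covariance, so the acquisition value is a differentiable function of the query locations. This is an illustrative JAX sketch, not the authors' code; the kernel, data, hyperparameters, and helper names (rbf, posterior, q_ei, q_ucb) are assumptions made for the example.

```python
import jax
import jax.numpy as jnp

def rbf(a, b, lengthscale=0.5):
    # Squared-exponential kernel between two sets of 1-D points.
    return jnp.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

# Toy observations defining the GP posterior (illustrative only).
X_obs = jnp.array([0.1, 0.4, 0.9])
y_obs = jnp.array([0.2, -0.6, 0.3])
K_inv = jnp.linalg.inv(rbf(X_obs, X_obs) + 1e-4 * jnp.eye(3))
best = y_obs.max()  # incumbent value (maximization)

def posterior(X_q):
    # GP posterior mean and covariance at a batch of q query points.
    k_star = rbf(X_q, X_obs)
    mu = k_star @ K_inv @ y_obs
    cov = rbf(X_q, X_q) - k_star @ K_inv @ k_star.T
    return mu, cov + 1e-6 * jnp.eye(X_q.shape[0])  # jitter for Cholesky

def q_ei(X_q, z):
    # Reparameterized Monte Carlo parallel EI: draw y = mu + L z using
    # fixed base samples z ~ N(0, I), so the estimate is a deterministic,
    # differentiable function of X_q.
    mu, cov = posterior(X_q)
    L = jnp.linalg.cholesky(cov)
    y = mu + z @ L.T                      # (num_samples, q) posterior draws
    return jnp.mean(jnp.maximum(y - best, 0.0).max(axis=1))

def q_ucb(X_q, z, beta=2.0):
    # Reparameterized parallel UCB in the spirit of the estimator the
    # abstract describes: E[max_j (mu_j + sqrt(beta * pi / 2) * |(L z)_j|)].
    mu, cov = posterior(X_q)
    L = jnp.linalg.cholesky(cov)
    spread = jnp.sqrt(beta * jnp.pi / 2) * jnp.abs(z @ L.T)
    return jnp.mean((mu + spread).max(axis=1))

z = jax.random.normal(jax.random.PRNGKey(0), (256, 2))  # fixed base samples, q = 2
X_q = jnp.array([0.25, 0.7])                            # candidate batch

ei_val, ei_grad = jax.value_and_grad(q_ei)(X_q, z)
ucb_val, ucb_grad = jax.value_and_grad(q_ucb)(X_q, z)
print(ei_val, ei_grad)
print(ucb_val, ucb_grad)
```

Because the base samples z are held fixed, both estimators are smooth in X_q, which is what allows the batch to be optimized with multi-start gradient methods such as L-BFGS or Adam.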

Authors (4)
  1. James T. Wilson (8 papers)
  2. Riccardo Moriconi (9 papers)
  3. Frank Hutter (177 papers)
  4. Marc Peter Deisenroth (73 papers)
Citations (69)
