
Efficient Nonmyopic Bayesian Optimization via One-Shot Multi-Step Trees (2006.15779v1)

Published 29 Jun 2020 in cs.LG, cs.NA, math.NA, and stat.ML

Abstract: Bayesian optimization is a sequential decision making framework for optimizing expensive-to-evaluate black-box functions. Computing a full lookahead policy amounts to solving a highly intractable stochastic dynamic program. Myopic approaches, such as expected improvement, are often adopted in practice, but they ignore the long-term impact of the immediate decision. Existing nonmyopic approaches are mostly heuristic and/or computationally expensive. In this paper, we provide the first efficient implementation of general multi-step lookahead Bayesian optimization, formulated as a sequence of nested optimization problems within a multi-step scenario tree. Instead of solving these problems in a nested way, we equivalently optimize all decision variables in the full tree jointly, in a "one-shot" fashion. Combining this with an efficient method for implementing multi-step Gaussian process "fantasization," we demonstrate that multi-step expected improvement is computationally tractable and exhibits performance superior to existing methods on a wide range of benchmarks.
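The "fantasization" step mentioned in the abstract can be illustrated with a minimal sketch: a Gaussian process is conditioned on a hypothetical (fantasy) observation at a candidate point, yielding a new posterior as if that point had actually been evaluated. This is not the authors' implementation (which uses efficient batched GP updates); it is a plain numpy toy with an assumed RBF kernel and a single fantasy sample, just to show the mechanics.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between row vectors in A and B."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-6):
    """Posterior mean and covariance of a zero-mean GP at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)
    K_star = rbf_kernel(X_test, X_train)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_star @ alpha
    v = np.linalg.solve(L, K_star.T)
    cov = rbf_kernel(X_test, X_test) - v.T @ v
    return mean, cov

# Observed data (hypothetical toy problem)
X = np.array([[0.0], [1.0]])
y = np.array([0.0, 1.0])

# Candidate point for the next evaluation
x_next = np.array([[0.5]])
mean, _ = gp_posterior(X, y, x_next)

# "Fantasize": pretend we observed a value at x_next (here, the posterior
# mean stands in for one fantasy sample), then condition the GP on it
# as if it were real data. Deeper lookahead repeats this at each tree node.
X_fant = np.vstack([X, x_next])
y_fant = np.append(y, mean)

# The fantasized model is more certain near x_next: variance shrinks.
_, cov_before = gp_posterior(X, y, x_next)
_, cov_after = gp_posterior(X_fant, y_fant, x_next)
print(cov_before[0, 0], cov_after[0, 0])
```

In the paper's one-shot formulation, the candidate points at every node of the multi-step scenario tree (the root candidate plus all fantasy-conditioned descendants) are optimized jointly rather than by nesting such conditioning steps inside one another.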

Authors (6)
  1. Shali Jiang (6 papers)
  2. Daniel R. Jiang (17 papers)
  3. Maximilian Balandat (27 papers)
  4. Brian Karrer (41 papers)
  5. Roman Garnett (38 papers)
  6. Jacob R. Gardner (39 papers)
Citations (40)