Constrained Bayesian Optimization with Max-Value Entropy Search (1910.07003v1)

Published 15 Oct 2019 in stat.ML and cs.LG

Abstract: Bayesian optimization (BO) is a model-based approach to sequentially optimize expensive black-box functions, such as the validation error of a deep neural network with respect to its hyperparameters. In many real-world scenarios, the optimization is further subject to a priori unknown constraints. For example, training a deep network with a given configuration may fail with an out-of-memory error when the model is too large. In this work, we focus on a general formulation of Gaussian process-based BO with continuous or binary constraints. We propose constrained Max-value Entropy Search (cMES), a novel information-theoretic acquisition function implementing this formulation. We also revisit the validity of the factorized approximation adopted for rapid computation of the MES acquisition function, showing empirically that this leads to inaccurate results. On an extensive set of real-world constrained hyperparameter optimization problems, we show that cMES compares favourably to prior work, while being simpler to implement and faster than other constrained extensions of Entropy Search.
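
The abstract describes the general setting — GP surrogates for an expensive objective and an a priori unknown (possibly binary) constraint, queried sequentially through an acquisition function — but not the cMES acquisition itself. The sketch below is not the paper's method; it illustrates that constrained BO loop with a generic constraint-weighted expected-improvement heuristic on a toy 1-D problem. The functions objective and feasible, the candidate grid, and all numerical choices are hypothetical stand-ins for an expensive training run and its out-of-memory signal.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Toy stand-ins for an expensive black box: an objective to minimize
# (think "validation error") and a binary feasibility signal
# (think "training did not hit an out-of-memory error").
def objective(x):
    return np.sin(3.0 * x) + 0.5 * x

def feasible(x):
    return x < 1.5

# Candidate grid over the 1-D search space [0, 2].
X_cand = np.linspace(0.0, 2.0, 200).reshape(-1, 1)

# Small random initial design.
X = rng.uniform(0.0, 2.0, size=(5, 1))
y = np.array([objective(v) for v in X[:, 0]])
c = np.array([float(feasible(v)) for v in X[:, 0]])  # 1 = feasible, 0 = infeasible

for _ in range(15):
    # Independent GP surrogates for the objective and the feasibility signal.
    gp_obj = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    gp_con = GaussianProcessRegressor(kernel=Matern(nu=2.5)).fit(X, c)

    mu, sigma = gp_obj.predict(X_cand, return_std=True)
    mu_c, sigma_c = gp_con.predict(X_cand, return_std=True)

    # Crude probability of feasibility: GP regression on the 0/1 labels,
    # thresholded at 0.5 (the paper treats binary constraints more carefully).
    prob_feas = 1.0 - norm.cdf(0.5, loc=mu_c, scale=sigma_c + 1e-9)

    # Expected improvement over the best feasible observation so far.
    feas = c > 0.5
    best = y[feas].min() if feas.any() else y.min()
    z = (best - mu) / (sigma + 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Constraint-weighted acquisition: improvement only counts where feasible.
    x_next = X_cand[np.argmax(ei * prob_feas), 0]

    # Evaluate the expensive black box at the chosen point and append the result.
    X = np.vstack([X, [[x_next]]])
    y = np.append(y, objective(x_next))
    c = np.append(c, float(feasible(x_next)))

feas = c > 0.5
print("best feasible x:", X[feas][np.argmin(y[feas]), 0], "objective:", y[feas].min())
```

Per the abstract, cMES replaces the EI-times-probability heuristic used above with an information-theoretic acquisition that scores candidates by how much they reduce uncertainty about the constrained optimum, and the paper reports this to be simpler and faster than other constrained extensions of Entropy Search.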

Authors (5)
  1. Valerio Perrone (20 papers)
  2. Iaroslav Shcherbatyi (5 papers)
  3. Rodolphe Jenatton (41 papers)
  4. Cedric Archambeau (44 papers)
  5. Matthias Seeger (22 papers)
Citations (37)
