
Linear Embedding-based High-dimensional Batch Bayesian Optimization without Reconstruction Mappings (2211.00947v1)

Published 2 Nov 2022 in stat.ML and cs.LG

Abstract: The optimization of high-dimensional black-box functions is a challenging problem. When a low-dimensional linear embedding structure can be assumed, existing Bayesian optimization (BO) methods often transform the original problem into optimization in a low-dimensional space. They exploit the low-dimensional structure and reduce the computational burden. However, we reveal that this approach could be limited or inefficient in exploring the high-dimensional space mainly due to the biased reconstruction of the high-dimensional queries from the low-dimensional queries. In this paper, we investigate a simple alternative approach: tackling the problem in the original high-dimensional space using the information from the learned low-dimensional structure. We provide a theoretical analysis of the exploration ability. Furthermore, we show that our method is applicable to batch optimization problems with thousands of dimensions without any computational difficulty. We demonstrate the effectiveness of our method on high-dimensional benchmarks and a real-world function.
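The biased-reconstruction issue the abstract points to can be illustrated with a minimal sketch. This is not the paper's code; the dimensions, the random linear map, and the clipping step are assumptions, modeled on common embedding-based BO setups where a low-dimensional query is mapped back into the high-dimensional box before evaluation:

```python
import numpy as np

# Hypothetical illustration (not the authors' method): embedding-based BO
# reconstructs a high-dimensional query x from a low-dimensional query z
# via a linear map, then clips to the box [-1, 1]^D. The clipping is one
# source of biased reconstruction: many distinct z's collapse onto the
# same boundary points, so exploration of the box is distorted.

rng = np.random.default_rng(0)
D, d = 1000, 5                       # high and low dimensions (assumed values)
A = rng.standard_normal((D, d))      # random linear embedding (assumed)

def reconstruct(z):
    """Map a low-dimensional query back into the high-dimensional box."""
    return np.clip(A @ z, -1.0, 1.0)

z = rng.standard_normal(d)
x = reconstruct(z)
# Fraction of coordinates that saturated at the box boundary.
frac_clipped = np.mean(np.abs(A @ z) > 1.0)
print(f"fraction of clipped coordinates: {frac_clipped:.2f}")
```

Under this sketch, a sizable fraction of the reconstructed coordinates sits exactly on the boundary of the box, hinting at why the paper instead proposes querying directly in the original high-dimensional space while still using the learned low-dimensional structure.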

Authors (4)
  1. Shuhei A. Horiguchi (6 papers)
  2. Tomoharu Iwata (64 papers)
  3. Taku Tsuzuki (1 paper)
  4. Yosuke Ozawa (1 paper)
Citations (1)
