SCORE: A 1D Reparameterization Technique to Break Bayesian Optimization's Curse of Dimensionality (2406.12661v1)

Published 18 Jun 2024 in cs.LG and stat.ML

Abstract: Bayesian optimization (BO) has emerged as a powerful tool for navigating complex search spaces, showcasing practical applications in the fields of science and engineering. However, since it typically relies on a surrogate model to approximate the objective function, BO grapples with heightened computational costs that tend to escalate as the number of parameters and experiments grows. Several methods such as parallelization, surrogate model approximations, and memory pruning have been proposed to cut down computing time, but they all fall short of resolving the core issue behind BO's curse of dimensionality. In this paper, a 1D reparametrization trick is proposed to break this curse and sustain linear time complexity for BO in high-dimensional landscapes. This fast and scalable approach named SCORE can successfully find the global minimum of needle-in-a-haystack optimization functions and fit real-world data without the high-performance computing resources typically required by state-of-the-art techniques.

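For context on the bottleneck the abstract refers to, the sketch below shows a generic Bayesian optimization loop with a Gaussian-process surrogate: refitting the surrogate requires an O(n^3) Cholesky factorization as observations accumulate, and the candidate pool needed to cover the search space grows with the dimensionality. This is not the paper's SCORE reparameterization; the objective function, RBF kernel, and lower-confidence-bound acquisition rule are illustrative assumptions.

```python
# Minimal, generic GP-based Bayesian optimization loop (NOT the SCORE method from
# the paper). It illustrates the scaling issue the abstract describes: the surrogate
# refit is O(n^3) in the number of observations, and candidate coverage degrades as
# the dimension grows. Objective, kernel, and acquisition rule are assumptions.
import numpy as np

def objective(x):
    # Hypothetical multimodal test function (not from the paper).
    return np.sum(x ** 2) + np.sin(5.0 * np.sum(x))

def rbf_kernel(A, B, length_scale=0.3):
    # Squared-exponential kernel between the rows of A and B.
    d2 = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(X, y, Xq, noise=1e-6):
    # Standard GP regression; the Cholesky factorization of the n x n kernel
    # matrix is the O(n^3) step that makes vanilla BO scale poorly.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Kq = rbf_kernel(X, Xq)
    mu = Kq.T @ alpha
    v = np.linalg.solve(L, Kq)
    var = np.clip(np.ones(len(Xq)) - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def bo_loop(dim=5, n_init=5, n_iter=20, n_candidates=2000, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(n_init, dim))
    y = np.array([objective(x) for x in X])
    for _ in range(n_iter):
        # The random candidate pool must grow rapidly with dim to cover the
        # space, which is one face of the curse of dimensionality.
        Xq = rng.uniform(-1.0, 1.0, size=(n_candidates, dim))
        mu, sigma = gp_posterior(X, y, Xq)
        # Lower-confidence-bound acquisition for minimization.
        x_next = Xq[np.argmin(mu - 2.0 * sigma)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[np.argmin(y)], y.min()

if __name__ == "__main__":
    x_best, f_best = bo_loop()
    print("best point:", x_best, "best value:", f_best)
```

The paper's contribution, per the abstract, is a 1D reparametrization that avoids this per-iteration surrogate cost blow-up and keeps the time complexity linear in high-dimensional settings.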
Authors (1)
  1. Joseph Chakar
