
Heteroscedastic Treed Bayesian Optimisation (1410.7172v2)

Published 27 Oct 2014 in cs.LG, math.OC, and stat.ML

Abstract: Optimising black-box functions is important in many disciplines, such as tuning machine learning models, robotics, finance and mining exploration. Bayesian optimisation is a state-of-the-art technique for the global optimisation of black-box functions which are expensive to evaluate. At the core of this approach is a Gaussian process prior that captures our belief about the distribution over functions. However, in many cases a single Gaussian process is not flexible enough to capture non-stationarity in the objective function. Consequently, heteroscedasticity negatively affects performance of traditional Bayesian methods. In this paper, we propose a novel prior model with hierarchical parameter learning that tackles the problem of non-stationarity in Bayesian optimisation. Our results demonstrate substantial improvements in a wide range of applications, including automatic machine learning and mining exploration.
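As context for the abstract: standard Bayesian optimisation fits a Gaussian process surrogate to the observed evaluations and picks the next query point by maximising an acquisition function. The sketch below illustrates that baseline loop with a stationary RBF kernel and expected improvement on a 1-D grid; it is *not* the paper's heteroscedastic treed model, and the objective function, kernel hyperparameters, and budget are all illustrative choices.

```python
import numpy as np
from math import erf

def rbf_kernel(A, B, lengthscale=0.2, variance=1.0):
    # Squared-exponential (RBF) kernel; stationary, unlike the paper's treed prior.
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Standard GP regression posterior mean and std at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimisation: E[max(best - f(x), 0)] under the GP posterior.
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * Phi + sigma * phi

def objective(x):
    # Hypothetical expensive black-box function to minimise (illustrative only).
    return np.sin(3.0 * x) + x ** 2 - 0.7 * x

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=3)      # small initial design
y = objective(X)
grid = np.linspace(-1.0, 2.0, 200)      # candidate points for the acquisition

for _ in range(10):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print(f"best x = {X[np.argmin(y)]:.2f}, best y = {y.min():.3f}")
```

Because the RBF kernel assumes one global lengthscale and noise level, this baseline struggles when the objective's smoothness or noise varies across the input space; the paper's hierarchical treed prior is aimed at exactly that non-stationary setting.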

Authors (4)
  1. John-Alexander M. Assael (3 papers)
  2. Ziyu Wang (137 papers)
  3. Bobak Shahriari (16 papers)
  4. Nando de Freitas (98 papers)
Citations (48)