A Batched Scalable Multi-Objective Bayesian Optimization Algorithm (1811.01323v1)

Published 4 Nov 2018 in cs.NE

Abstract: Surrogate-assisted optimization is a promising approach for solving expensive multi-objective optimization problems. However, most existing surrogate-assisted multi-objective optimization algorithms have three main drawbacks: 1) they cannot scale well to problems with high-dimensional decision spaces, 2) they cannot incorporate available gradient information, and 3) they do not support batch optimization. These drawbacks prevent their use on many real-world large-scale optimization problems. This paper proposes a batched scalable multi-objective Bayesian optimization algorithm to tackle these issues. The proposed algorithm uses a Bayesian neural network as the scalable surrogate model. Powered by Monte Carlo dropout and Sobolev training, the model can be easily trained and can incorporate available gradient information. We also propose a novel batch hypervolume upper confidence bound acquisition function to support batch optimization. Experimental results on various benchmark problems and a real-world application demonstrate the efficiency of the proposed algorithm.
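
To give a flavor of the ingredients the abstract names, the sketch below shows a Monte Carlo dropout surrogate that estimates a predictive mean and uncertainty per objective, plus a simple UCB-style score. This is a minimal illustration under assumed choices (the `MCDropoutSurrogate` architecture, the `beta` trade-off weight, and the per-objective bound are all hypothetical), not the paper's actual model or its batch hypervolume UCB acquisition function.

```python
import torch
import torch.nn as nn

class MCDropoutSurrogate(nn.Module):
    """Feed-forward surrogate whose dropout layers stay active at
    prediction time, so repeated stochastic forward passes give a
    Monte Carlo estimate of mean and uncertainty per objective."""

    def __init__(self, in_dim, out_dim, hidden=64, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

    @torch.no_grad()
    def predict(self, x, n_samples=50):
        # Calling train() keeps the Dropout layers stochastic, which is
        # what turns repeated forward passes into MC dropout samples.
        self.train()
        samples = torch.stack([self(x) for _ in range(n_samples)])
        return samples.mean(dim=0), samples.std(dim=0)

# Toy usage: 10-dimensional decision vectors, 2 objectives (assumed sizes).
model = MCDropoutSurrogate(in_dim=10, out_dim=2)
x = torch.rand(5, 10)
mu, sigma = model.predict(x)

# A per-objective optimistic bound for minimization; beta is a
# hypothetical exploration weight. The paper's acquisition function
# aggregates such confidence bounds over a whole batch via hypervolume.
beta = 2.0
lcb = mu - beta * sigma
```

Sobolev training, also mentioned in the abstract, would additionally penalize the mismatch between the surrogate's input gradients and any available gradients of the true objectives during training; it is omitted here for brevity.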

Authors (5)
  1. Xi Lin (135 papers)
  2. Hui-Ling Zhen (33 papers)
  3. Zhenhua Li (27 papers)
  4. Qingfu Zhang (78 papers)
  5. Sam Kwong (104 papers)
Citations (11)