
Deep Learning for Bayesian Optimization of Scientific Problems with High-Dimensional Structure (2104.11667v4)

Published 23 Apr 2021 in cs.LG, physics.app-ph, physics.chem-ph, physics.comp-ph, and physics.optics

Abstract: Bayesian optimization (BO) is a popular paradigm for global optimization of expensive black-box functions, but there are many domains where the function is not completely a black-box. The data may have some known structure (e.g. symmetries) and/or the data generation process may be a composite process that yields useful intermediate or auxiliary information in addition to the value of the optimization objective. However, surrogate models traditionally employed in BO, such as Gaussian Processes (GPs), scale poorly with dataset size and do not easily accommodate known structure. Instead, we use Bayesian neural networks, a class of scalable and flexible surrogate models with inductive biases, to extend BO to complex, structured problems with high dimensionality. We demonstrate BO on a number of realistic problems in physics and chemistry, including topology optimization of photonic crystal materials using convolutional neural networks, and chemical property optimization of molecules using graph neural networks. On these complex tasks, we show that neural networks often outperform GPs as surrogate models for BO in terms of both sampling efficiency and computational cost.
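The loop the abstract describes — fit a scalable surrogate with uncertainty, maximize an acquisition function, evaluate the expensive objective, repeat — can be sketched in a few lines. This is a minimal illustration only, not the paper's implementation: the toy 1-D objective, the random-Fourier-feature ensemble (a cheap stand-in for a Bayesian neural network, with ensemble disagreement supplying the predictive uncertainty), and the upper-confidence-bound acquisition are all assumptions made for the sketch.

```python
import numpy as np

def objective(x):
    # Toy 1-D objective, a stand-in for an expensive simulation.
    return -np.sin(3 * x) - x**2 + 0.7 * x

class RFFEnsemble:
    """Ensemble of random-Fourier-feature ridge regressors.

    A lightweight stand-in for the Bayesian neural networks in the paper:
    the spread across ensemble members plays the role of the posterior
    uncertainty that Bayesian optimization needs.
    """
    def __init__(self, n_models=5, n_features=100, lengthscale=0.3, seed=0):
        rng = np.random.default_rng(seed)
        self.members = [
            {"w": rng.normal(0.0, 1.0 / lengthscale, size=n_features),
             "b": rng.uniform(0.0, 2.0 * np.pi, size=n_features),
             "theta": None}
            for _ in range(n_models)
        ]

    def _features(self, m, x):
        # Random cosine features approximating an RBF kernel.
        return np.cos(np.outer(x, m["w"]) + m["b"])

    def fit(self, x, y):
        for m in self.members:
            P = self._features(m, x)
            # Small ridge term for numerical stability.
            A = P.T @ P + 1e-3 * np.eye(P.shape[1])
            m["theta"] = np.linalg.solve(A, P.T @ y)

    def predict(self, x):
        preds = np.stack([self._features(m, x) @ m["theta"]
                          for m in self.members])
        return preds.mean(axis=0), preds.std(axis=0)

def bayes_opt(n_init=5, n_iter=15, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-2.0, 2.0, size=n_init)   # initial random design
    y = objective(x)
    model = RFFEnsemble()
    for _ in range(n_iter):
        model.fit(x, y)
        # Optimize the acquisition over a random candidate set.
        cand = rng.uniform(-2.0, 2.0, size=512)
        mu, sigma = model.predict(cand)
        ucb = mu + 2.0 * sigma                # upper-confidence-bound acquisition
        x_next = cand[np.argmax(ucb)]
        x = np.append(x, x_next)
        y = np.append(y, objective(x_next))   # one expensive evaluation per round
    best = np.argmax(y)
    return x[best], y[best]
```

The paper's setting swaps this toy surrogate for convolutional or graph neural networks matched to the problem's structure, which is what lets the same loop scale to high-dimensional inputs and large datasets where a GP surrogate becomes expensive.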

Authors (6)
  1. Samuel Kim (24 papers)
  2. Peter Y. Lu (15 papers)
  3. Charlotte Loh (10 papers)
  4. Jamie Smith (9 papers)
  5. Jasper Snoek (42 papers)
  6. Marin Soljačić (141 papers)
Citations (14)