Design Editing for Offline Model-based Optimization (2405.13964v3)

Published 22 May 2024 in cs.LG and cs.CE

Abstract: Offline model-based optimization (MBO) aims to maximize a black-box objective function using only an offline dataset of designs and scores. These tasks span various domains, such as robotics, material design, and protein and molecular engineering. A common approach involves training a surrogate model on existing designs and their corresponding scores, and then generating new designs through gradient-based updates with respect to the surrogate model. This method suffers from the out-of-distribution issue, where the surrogate model may erroneously predict high scores for unseen designs. To address this challenge, we introduce a novel method, Design Editing for Offline Model-based Optimization (DEMO), which leverages a diffusion prior to calibrate overly optimized designs. DEMO first generates pseudo design candidates by performing gradient ascent with respect to a surrogate model. An editing process then refines these pseudo design candidates by introducing noise and subsequently denoising them with a diffusion prior trained on the offline dataset, ensuring they align with the distribution of valid designs. We provide a theoretical proof that the difference between the final optimized designs generated by DEMO and the prior distribution of the offline dataset is controlled by the noise injected during the editing process. Empirical evaluations on seven offline MBO tasks show that DEMO outperforms various baseline methods, achieving the highest mean rank of 2.1 and a median rank of 1.
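The two-stage procedure described in the abstract can be sketched in a toy form. Everything below is an illustrative stand-in, not the paper's implementation: the surrogate is a hand-written quadratic with a known gradient, and the diffusion prior's reverse process is replaced by a simple shrinkage toward the offline-data mean, just to show the gradient-ascent-then-edit control flow.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy offline dataset of designs (real tasks use logged designs and scores).
data = rng.normal(loc=1.0, scale=0.5, size=(128, 4))
data_mean = data.mean(axis=0)

def surrogate_grad(x):
    # Gradient of a toy surrogate f(x) = -||x - 3||^2, whose peak at 3 lies
    # outside the data distribution (mean ~1), mimicking over-optimization.
    return -2.0 * (x - 3.0)

def gradient_ascent(x, steps=50, lr=0.05):
    # Stage 1: generate a pseudo design candidate by gradient ascent
    # with respect to the surrogate model.
    for _ in range(steps):
        x = x + lr * surrogate_grad(x)
    return x

def edit(pseudo, noise_level=0.5, prior_weight=0.6):
    # Stage 2 (DEMO-style editing): inject noise, then "denoise" with a
    # prior. Here the denoiser is a toy interpolation toward the offline-data
    # mean, standing in for reverse diffusion with a trained prior.
    noisy = pseudo + noise_level * rng.normal(size=pseudo.shape)
    return prior_weight * data_mean + (1.0 - prior_weight) * noisy

x0 = data[0]
pseudo = gradient_ascent(x0)   # over-optimized, likely out-of-distribution
final = edit(pseudo)           # calibrated back toward valid designs

# The edited design should lie closer to the data distribution than the
# raw pseudo design, reflecting the calibration role of the prior.
dist_pseudo = float(np.linalg.norm(pseudo - data_mean))
dist_final = float(np.linalg.norm(final - data_mean))
```

In the paper's framing, the `noise_level` knob controls how far the final designs may drift from the offline-data distribution; the theoretical result bounds that gap by the injected noise.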

Authors (8)
  1. Ye Yuan (274 papers)
  2. Youyuan Zhang (8 papers)
  3. Can Chen (64 papers)
  4. Haolun Wu (27 papers)
  5. Zixuan Li (63 papers)
  6. Jianmo Li (1 paper)
  7. James J. Clark (32 papers)
  8. Xue Liu (156 papers)
Citations (1)
