Transfer Learning based Search Space Design for Hyperparameter Tuning (2206.02511v1)

Published 6 Jun 2022 in cs.LG and cs.AI

Abstract: The tuning of hyperparameters becomes increasingly important as ML models are extensively applied in data mining applications. Among various approaches, Bayesian optimization (BO) is a successful methodology for tuning hyperparameters automatically. While traditional methods optimize each tuning task in isolation, there has been recent interest in speeding up BO by transferring knowledge across previous tasks. In this work, we introduce an automatic method to design the BO search space with the aid of tuning history from past tasks. This simple yet effective approach can be used to endow many existing BO methods with transfer learning capabilities. In addition, it enjoys three advantages: universality, generality, and safeness. Extensive experiments show that our approach considerably boosts BO by designing a promising and compact search space instead of using the entire space, and outperforms state-of-the-art methods on a wide range of benchmarks, including machine learning and deep learning tuning tasks, and neural architecture search.
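To make the core idea concrete, here is a minimal sketch of how a promising, compact search box might be derived from past tuning histories. This is an illustrative assumption, not the paper's exact algorithm: the function name `design_search_space` and the parameters `top_frac` and `margin` are hypothetical, and the heuristic simply keeps each past task's best configurations and shrinks the bounds to the region they span, with a small padding as a crude safeness guard.

```python
import numpy as np

def design_search_space(histories, bounds, top_frac=0.1, margin=0.05):
    """Return tightened (lower, upper) bounds per hyperparameter.

    histories : list of (configs, losses) per past task, where configs has
                shape (n_i, d) and losses has shape (n_i,).
    bounds    : array of shape (d, 2) with the original [lower, upper] box.
    top_frac  : fraction of best configurations kept per task (assumption).
    margin    : relative padding around the promising region, a simple
                guard against over-shrinking the space (assumption).
    """
    promising = []
    for configs, losses in histories:
        k = max(1, int(top_frac * len(losses)))
        best = np.argsort(losses)[:k]  # lower loss is better
        promising.append(configs[best])
    promising = np.vstack(promising)

    lo, hi = promising.min(axis=0), promising.max(axis=0)
    pad = margin * (bounds[:, 1] - bounds[:, 0])
    # Clip the padded box back into the original search space.
    new_lo = np.maximum(lo - pad, bounds[:, 0])
    new_hi = np.minimum(hi + pad, bounds[:, 1])
    return np.stack([new_lo, new_hi], axis=1)

# Example: two past tasks over a 2-D space (e.g., learning rate, dropout).
rng = np.random.default_rng(0)
bounds = np.array([[1e-5, 1e-1], [0.0, 1.0]])
histories = []
for _ in range(2):
    configs = rng.uniform(bounds[:, 0], bounds[:, 1], size=(50, 2))
    losses = (configs[:, 0] - 1e-3) ** 2 + (configs[:, 1] - 0.3) ** 2
    histories.append((configs, losses))
print(design_search_space(histories, bounds))
```

Any off-the-shelf BO method can then be run inside the returned box instead of the full space, which is how a search-space design step can endow existing optimizers with transfer learning capabilities.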

Authors (7)
  1. Yang Li (1140 papers)
  2. Yu Shen (56 papers)
  3. Huaijun Jiang (8 papers)
  4. Tianyi Bai (26 papers)
  5. Wentao Zhang (261 papers)
  6. Ce Zhang (215 papers)
  7. Bin Cui (165 papers)
Citations (12)