
Automatic Hyper-Parameter Optimization Based on Mapping Discovery from Data to Hyper-Parameters (2003.01751v1)

Published 3 Mar 2020 in cs.LG and stat.ML

Abstract: Machine learning algorithms have made remarkable achievements in the field of artificial intelligence. However, most machine learning algorithms are sensitive to their hyper-parameters. Manual tuning is a common method of hyper-parameter optimization, but it is costly and depends heavily on experience. Automatic hyper-parameter optimization (autoHPO) is favored for its effectiveness, yet current autoHPO methods are usually effective only for certain types of problems and incur a high time cost. In this paper, we propose an efficient automatic hyper-parameter optimization approach based on the mapping from data to the corresponding hyper-parameters. To describe such a mapping, we propose a sophisticated network structure. To obtain such a mapping, we develop effective network construction algorithms. We also design a strategy to further optimize the result during the application of the mapping. Extensive experimental results demonstrate that the proposed approaches significantly outperform the state-of-the-art approaches.
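The core idea in the abstract, learning a direct mapping from dataset characteristics to good hyper-parameter values so that a new dataset gets its hyper-parameters in one prediction rather than an iterative search, can be sketched as follows. This is a minimal illustrative sketch, not the authors' network structure: the meta-features, the synthetic "best hyper-parameter" targets, and the use of scikit-learn's `MLPRegressor` are all assumptions for demonstration.

```python
# Hypothetical sketch: regress from dataset meta-features to a
# hyper-parameter value observed to work well on past datasets,
# then predict a hyper-parameter for a new dataset in one shot.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy "experience" corpus: each row holds meta-features of a past
# dataset (e.g. scaled sample count, feature count, class balance);
# each target is the hyper-parameter that worked well there.
meta_features = rng.uniform(0.0, 1.0, size=(200, 3))
# Synthetic ground-truth relation, purely for illustration.
best_hparams = 0.5 * meta_features[:, 0] + 0.3 * meta_features[:, 1]

# A small feed-forward network stands in for the paper's mapping network.
mapper = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                      random_state=0)
mapper.fit(meta_features, best_hparams)

# One-shot prediction for an unseen dataset: no iterative HPO loop.
new_dataset_meta = np.array([[0.8, 0.4, 0.5]])
predicted_hparam = float(mapper.predict(new_dataset_meta)[0])
print(round(predicted_hparam, 2))
```

The predicted value could then be refined by a short local search, which corresponds to the paper's strategy of further optimizing the result during application of the mapping.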

Authors (6)
  1. Bozhou Chen (6 papers)
  2. Kaixin Zhang (14 papers)
  3. Longshen Ou (9 papers)
  4. Chenmin Ba (2 papers)
  5. Hongzhi Wang (94 papers)
  6. Chunnan Wang (11 papers)
Citations (2)