
Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization (2105.01015v1)

Published 3 May 2021 in cs.LG, cs.AI, and stat.ML

Abstract: Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the used training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa - there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
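To make the problem setting concrete, here is a minimal, hypothetical sketch of multi-objective search over a joint NAS + HPO space. It is not the paper's method: the search space, the toy evaluation function, and all names are illustrative assumptions. It uses random search over a combined space of architectural choices and training hyperparameters, then filters the evaluated points down to a Pareto front over two objectives to be minimized (validation error and model size).

```python
import random

# Hypothetical joint NAS + HPO search space: architectural choices
# (num_layers, width) alongside a training hyperparameter (learning_rate).
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],               # architecture
    "width": [32, 64, 128],                # architecture
    "learning_rate": [1e-3, 1e-2, 1e-1],   # hyperparameter
}

def sample_config(rng):
    """Draw one joint configuration uniformly at random."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def evaluate(cfg):
    """Toy stand-in for training: returns (error, size), both minimized.
    A real evaluation would train the network and measure both objectives."""
    size = cfg["num_layers"] * cfg["width"]
    # Invented error model: larger nets and a moderate learning rate help.
    error = 1.0 / size + abs(cfg["learning_rate"] - 1e-2)
    return (error, size)

def pareto_front(points):
    """Keep points not dominated in both objectives (minimization)."""
    return [
        p for p in points
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
    ]

rng = random.Random(0)
evals = [evaluate(sample_config(rng)) for _ in range(50)]
front = pareto_front(evals)
```

Random search is only the simplest baseline in this family; the paper's point is to extend stronger single-objective NAS/HPO approaches to the multi-objective joint setting, but the search-space structure (architecture and hyperparameters sampled together, multiple objectives returned per evaluation) is the same.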

Authors (10)
  1. Julia Guerrero-Viu (8 papers)
  2. Sven Hauns (1 paper)
  3. Sergio Izquierdo (8 papers)
  4. Guilherme Miotto (1 paper)
  5. Simon Schrodi (10 papers)
  6. Thomas Elsken (11 papers)
  7. Difan Deng (10 papers)
  8. Marius Lindauer (71 papers)
  9. Frank Hutter (177 papers)
  10. Andre Biedenkapp (2 papers)
Citations (24)
