Transferable Neural Processes for Hyperparameter Optimization

Published 7 Sep 2019 in cs.LG, cs.AI, and stat.ML (arXiv:1909.03209v2)

Abstract: Automated machine learning aims to automate the whole process of machine learning, including model configuration. In this paper, we focus on automated hyperparameter optimization (HPO) based on sequential model-based optimization (SMBO). Although conventional SMBO algorithms work well when abundant HPO trials are available, they fall short in practical applications where a single trial on a huge dataset can be so costly that an optimal hyperparameter configuration must be found in as few trials as possible. Observing that human experts draw on their experience with a machine learning model by trying configurations that once performed well on other datasets, we speed up HPO by transferring knowledge from historical HPO trials on other datasets. We propose an end-to-end and efficient HPO algorithm named Transfer Neural Processes (TNP), which achieves transfer learning by incorporating trials on other datasets, initializing the model with well-generalized parameters, and learning an initial set of hyperparameters to evaluate. Extensive experiments on OpenML datasets and three computer vision datasets show that the proposed model achieves state-of-the-art performance with at least an order of magnitude fewer trials.
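
To ground the terminology, the sketch below is a plain single-task SMBO loop with a Gaussian-process surrogate and an expected-improvement acquisition function. It illustrates only the baseline setting the abstract refers to, not the paper's TNP algorithm: TNP would replace the surrogate with a neural process trained on historical trials from other datasets and would learn, rather than randomize, the initial trials. The objective function, search bounds, and trial budget are hypothetical stand-ins.

```python
# Minimal SMBO sketch (baseline setting only, NOT the paper's TNP method).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(x):
    # Hypothetical validation loss as a function of one hyperparameter.
    return np.sin(3 * x) + 0.1 * x ** 2

def expected_improvement(mu, sigma, best):
    # EI: expected amount by which a candidate improves on the incumbent.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
candidates = np.linspace(-2, 2, 200).reshape(-1, 1)

# Initial design: random trials. TNP instead learns this initial set of
# hyperparameters from trials on other datasets, one source of its speed-up.
X = rng.uniform(-2, 2, size=(3, 1))
y = np.array([objective(x[0]) for x in X])

for _ in range(20):  # each iteration is one (possibly costly) HPO trial
    surrogate = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mu, sigma = surrogate.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best hyperparameter:", X[np.argmin(y)], "loss:", y.min())
```

The loop makes the cost model in the abstract concrete: when each call to `objective` is an expensive training run, shaving off even a handful of iterations matters, which is why transferring knowledge into both the surrogate and the initial design is the paper's focus.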
