
Evolutionary Neural AutoML for Deep Learning (1902.06827v3)

Published 18 Feb 2019 in cs.NE

Abstract: Deep neural networks (DNNs) have produced state-of-the-art results in many benchmarks and problem domains. However, the success of DNNs depends on the proper configuration of its architecture and hyperparameters. Such a configuration is difficult and as a result, DNNs are often not used to their full potential. In addition, DNNs in commercial applications often need to satisfy real-world design constraints such as size or number of parameters. To make configuration easier, automatic machine learning (AutoML) systems for deep learning have been developed, focusing mostly on optimization of hyperparameters. This paper takes AutoML a step further. It introduces an evolutionary AutoML framework called LEAF that not only optimizes hyperparameters but also network architectures and the size of the network. LEAF makes use of both state-of-the-art evolutionary algorithms (EAs) and distributed computing frameworks. Experimental results on medical image classification and natural language analysis show that the framework can be used to achieve state-of-the-art performance. In particular, LEAF demonstrates that architecture optimization provides a significant boost over hyperparameter optimization, and that networks can be minimized at the same time with little drop in performance. LEAF therefore forms a foundation for democratizing and improving AI, as well as making AI practical in future applications.

Citations (103)

Summary

  • The paper introduces LEAF, a novel evolutionary AutoML framework optimizing both network architectures and hyperparameters simultaneously.
  • It employs an adapted CoDeepNEAT algorithm to enhance search space exploration and simplify the design of complex deep networks.
  • LEAF achieves state-of-the-art performance on tasks like toxicity and image classification while reducing model complexity and parameter count.

Analysis of "Evolutionary Neural AutoML for Deep Learning"

The paper "Evolutionary Neural AutoML for Deep Learning" by Liang, Meyerson, Hodjat, Fink, Mutch, and Miikkulainen introduces an advanced AutoML framework named LEAF. The framework applies evolutionary algorithms to automate the configuration of deep neural networks (DNNs), optimizing both their architectural structures and hyperparameters. This dual optimization approach is crucial in exploiting the full capabilities of DNNs, addressing the challenging problem of model design in deep learning.

The LEAF framework employs a novel adaptation of the CoDeepNEAT evolutionary algorithm to evolve both the hyperparameters and the network architectures. CoDeepNEAT significantly enhances the search space exploration due to its capability to evolve complex multidimensional architectures and hyperparameters concurrently. Unlike traditional methods that handle these aspects sequentially, LEAF's approach allows for the derivation of optimized networks without requiring domain expertise. This is particularly advantageous given the intricacies involved in designing neural architectures that meet specific performance criteria while adhering to constraints such as computational efficiency and memory usage.
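The coevolutionary idea behind CoDeepNEAT can be illustrated with a minimal sketch: a population of "blueprints" (graphs whose slots reference reusable modules) is evolved jointly with a population of "modules" (small layer motifs with their own hyperparameters). Everything below is a toy illustration, not the paper's implementation; the module fields, the linear blueprint shape, and the stand-in fitness function are all assumptions.

```python
import random

random.seed(0)

def random_module():
    # A "module" here is just a small hyperparameter bundle (illustrative
    # fields: layer width and dropout rate), not a trained subnetwork.
    return {"width": random.choice([64, 128, 256]),
            "dropout": random.uniform(0.0, 0.5)}

def random_blueprint(n_slots=3):
    # A blueprint is a graph whose nodes point at modules; a simple
    # ordered list of module indices stands in for that graph here.
    return [random.randrange(10) for _ in range(n_slots)]

modules = [random_module() for _ in range(10)]
blueprints = [random_blueprint() for _ in range(8)]

def assemble(blueprint):
    # Replace each blueprint slot with the referenced module to form a network.
    return [modules[i] for i in blueprint]

def toy_fitness(network):
    # Stand-in for training plus validation accuracy; a real system would
    # train the assembled DNN. This toy score prefers width near 128 and
    # dropout near 0.2, so it is always <= 0.
    return -sum(abs(m["width"] - 128) / 128 + abs(m["dropout"] - 0.2)
                for m in network)

for generation in range(20):
    scored = sorted(blueprints, key=lambda bp: toy_fitness(assemble(bp)),
                    reverse=True)
    survivors = scored[: len(scored) // 2]      # truncation selection
    children = []
    for parent in survivors:
        child = list(parent)
        # Mutate one slot to point at a different module species.
        child[random.randrange(len(child))] = random.randrange(len(modules))
        children.append(child)
    blueprints = survivors + children

best = max(blueprints, key=lambda bp: toy_fitness(assemble(bp)))
```

In the full algorithm the module population would also be mutated and recombined, and fitness would be assigned back to both blueprints and the modules they use; the sketch evolves only the blueprint side for brevity.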

The paper demonstrates LEAF's efficacy through empirical results on two real-world tasks: Wikipedia comment toxicity classification and chest X-ray multitask image classification. The experiments underscore LEAF's ability to discover state-of-the-art neural networks that outperform existing automated systems such as Google AutoML as well as conventional manual designs. Specifically, LEAF's evolutionary approach yielded clear gains in both predictive performance and computational efficiency, achieving competitive results with fewer parameters.

Key insights from the experimental results reveal that LEAF not only excels in performance optimization but also facilitates model complexity minimization via multiobjective optimization. This capability is essential for deploying AI models on resource-constrained devices, contributing towards democratizing access to AI technology by allowing broader adoption in practical applications.
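The complexity-minimization step can be pictured as Pareto-based selection: candidates are scored on two objectives to be minimized, validation error and parameter count, and only nondominated candidates are retained. The sketch below is a generic nondominated-set computation, not LEAF's actual selection code, and the candidate tuples are made-up illustrative values.

```python
def dominates(a, b):
    # a dominates b if a is no worse on every objective and strictly
    # better on at least one (both objectives are minimized here:
    # validation error rate and parameter count in millions).
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(candidates):
    # Keep only candidates not dominated by any other candidate.
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# (validation error, millions of parameters) -- hypothetical values
candidates = [(0.10, 40.0), (0.11, 12.0), (0.12, 5.0),
              (0.12, 30.0), (0.15, 4.0)]
front = pareto_front(candidates)
# (0.12, 30.0) is excluded: (0.11, 12.0) is better on both objectives.
```

Selecting from the front rather than by accuracy alone is what lets the search trade a small drop in performance for a large reduction in network size, which is the behavior the paper reports.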

The introduction of LEAF also sets the stage for several speculative future developments in AI. One is the ability of automated systems to incorporate domain-specific constraints directly into the evolutionary process, further optimizing resource utilization without sacrificing performance. Moreover, the system's support for multitask learning broadens its applicability across domains, potentially leading to new benchmarks in AI applications.

In conclusion, the paper makes a significant contribution to the field of AutoML by presenting a robust framework that advances neural architecture search through evolutionary techniques. LEAF not only meets current needs by automating complex model design but also lays the foundation for future AI advancements that leverage its evolutionary approach. With continued improvement, the framework could offer even more efficient solutions that adapt readily across machine learning challenges.
