
NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search

Published 20 Jul 2020 in cs.CV, cs.LG, and cs.NE | (2007.10396v1)

Abstract: In this paper, we propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives. It comprises of two surrogates, one at the architecture level to improve sample efficiency and one at the weights level, through a supernet, to improve gradient descent training efficiency. On standard benchmark datasets (C10, C100, ImageNet), the resulting models, dubbed NSGANetV2, either match or outperform models from existing approaches with the search being orders of magnitude more sample efficient. Furthermore, we demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets, e.g. STL-10, Flowers102, Oxford Pets, FGVC Aircrafts etc. In all cases, NSGANetV2s improve the state-of-the-art (under mobile setting), suggesting that NAS can be a viable alternative to conventional transfer learning approaches in handling diverse scenarios such as small-scale or fine-grained datasets. Code is available at https://github.com/mikelzc1990/nsganetv2

Citations (140)

Summary

  • The paper introduces NSGANetV2, a NAS approach that employs evolutionary multi-objective optimization with surrogate models to reduce sampling and computational costs.
  • It utilizes online learning surrogate models at both architecture and weight levels to efficiently predict model performance without exhaustive sampling.
  • Empirical evaluations on ImageNet and other datasets demonstrate that NSGANetV2 matches or exceeds state-of-the-art mobile architectures while requiring significantly fewer GPU days of search.

This essay critically examines the paper on NSGANetV2, a proposed methodology for efficient Neural Architecture Search (NAS) using an evolutionary, surrogate-assisted approach to optimize for multiple objectives. The authors strive to address the computational complexity inherent in traditional NAS methods, particularly when applied to generating task-specific models across diverse datasets.

Methodological Approach

The paper introduces NSGANetV2 as an evolutionary multi-objective approach that employs surrogate models at both the architecture and weight levels to improve sample and computational efficiency. The architecture-level surrogate predicts top-1 accuracy directly from an encoded architecture, using regressors such as Multi-Layer Perceptrons (MLPs) and Gaussian Processes (GPs), so that most candidates never need to be trained and evaluated in full. The weight-level surrogate is a supernet from which candidate architectures inherit weights, sparing each candidate the exhaustive gradient-descent training that the lower-level NAS problem traditionally requires.
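The idea of an accuracy surrogate can be sketched in a few lines. The following is a minimal NumPy-only Gaussian-process regressor fit on architecture encodings; the encoding scheme, kernel hyperparameters, and synthetic "measured" accuracies are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, length_scale=2.0):
    # Squared-exponential kernel on architecture encodings.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_fit_predict(X, y, X_new, noise=1e-2):
    """Posterior mean/std of a GP surrogate (prior mean = data mean)."""
    mu = y.mean()
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    K_s = rbf_kernel(X_new, X)
    mean = mu + K_s @ np.linalg.solve(K, y - mu)
    # Prior variance of the RBF kernel at zero distance is 1.0.
    var = np.ones(len(X_new)) - np.einsum("ij,ji->i", K_s, np.linalg.solve(K, K_s.T))
    return mean, np.sqrt(np.clip(var, 0.0, None))

# Hypothetical encoding: each architecture is a vector of discrete choices
# (e.g., kernel size / expansion ratio / depth per stage), here 8 slots of 0-3.
X = rng.integers(0, 4, size=(30, 8)).astype(float)
# Synthetic "measured" top-1 accuracies standing in for supernet evaluations.
y = 70.0 + 0.5 * X.sum(axis=1) + rng.normal(0.0, 0.2, size=30)

candidate = rng.integers(0, 4, size=(1, 8)).astype(float)
mean, std = gp_fit_predict(X, y, candidate)
```

The predicted mean ranks candidates cheaply, while the posterior standard deviation indicates where the surrogate is unsure and a real evaluation would be most informative.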

A significant methodological innovation lies in employing an online learning approach to surrogate modeling, focusing on architectures proximal to the current Pareto frontier. This contrasts sharply with offline surrogate models, which often require extensive computational resources to sample and evaluate thousands of architectures. By choosing to sample just 350 architectures on ImageNet, the proposed approach demonstrates substantial improvements in sampling efficiency compared to previous methodologies, such as OnceForAll and PNAS, which require tens of thousands of samples.
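The focus on architectures near the Pareto frontier can be illustrated with a minimal non-dominated filter, a simplified stand-in for the NSGA-II-style survival step the method builds on; the objective values below are made-up examples, not results from the paper.

```python
def pareto_front(points):
    """Indices of non-dominated points when every objective is minimized.

    A point p is dominated if some other point q is no worse in all
    objectives and differs from p (i.e., strictly better in at least one).
    """
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(qk <= pk for qk, pk in zip(q, p)) and q != p
            for j, q in enumerate(points)
            if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Hypothetical candidates as (top-1 error %, MAdds in millions); the online
# surrogate would be refit on architectures drawn from this front.
candidates = [(24.0, 600.0), (25.5, 400.0), (23.5, 800.0), (26.0, 650.0)]
print(pareto_front(candidates))  # → [0, 1, 2]; the last point is dominated
```

Restricting surrogate training and new sampling to this neighborhood is what lets the search converge with only a few hundred evaluated architectures.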

Empirical Evaluation

The empirical evaluation of NSGANetV2 spans several standard and non-standard datasets, such as CIFAR-10, CIFAR-100, ImageNet, and others like STL-10 and Oxford Flowers102. The reported results are noteworthy, demonstrating that the NSGANetV2 models either match or surpass the performance of existing state-of-the-art architectures under mobile settings. Specifically, the NSGANetV2 approach stands out for its ability to design architectures that are both highly performant and computationally efficient with significantly lower search costs measured in GPU days.

The methodology's versatility is explicitly showcased through successful application in diverse non-standard datasets, suggesting that NAS may serve as a practical alternative to conventional transfer learning, especially with small-scale or fine-grained datasets.

Implications and Future Directions

The findings of this research have both practical and theoretical implications for neural architecture search. Practically, the reduced computational burden makes NAS a viable option for scenarios that require high-performance, customized neural networks under constrained computational budgets. Theoretically, coupling evolutionary multi-objective optimization with surrogate-assisted evaluation invites exploration of more sophisticated surrogate models and refinement techniques, potentially catalyzing further advances in the field.

One promising direction for future work is improving surrogate-model accuracy and efficiency, further reducing the number of architecture evaluations the search requires. Additionally, exploring the synergy between surrogate-assisted NAS and transfer learning strategies may further cut model training times and computational requirements, leading to new paradigms in hyper-efficient architecture design.

In summary, NSGANetV2 represents a notable contribution to the field of neural architecture search by delineating a strategy that balances efficiency with high model accuracy across multiple objectives and datasets. Its success in various benchmarks serves as a testament to the potential of applying evolutionary and surrogate modeling techniques to the ever-evolving challenges of artificial intelligence.
