
Large Scale Neural Architecture Search with Polyharmonic Splines (2011.10608v1)

Published 20 Nov 2020 in cs.CV

Abstract: Neural Architecture Search (NAS) is a powerful tool to automatically design deep neural networks for many tasks, including image classification. Due to the significant computational burden of the search phase, most NAS methods have so far focused on small, balanced datasets. All attempts at conducting NAS at large scale have employed small proxy sets, and then transferred the learned architectures to larger datasets by replicating or stacking the searched cells. We propose a NAS method based on polyharmonic splines that can perform search directly on large-scale, imbalanced target datasets. We demonstrate the effectiveness of our method on the ImageNet22K benchmark [16], which contains 14 million images distributed in a highly imbalanced manner over 21,841 categories. By exploring the search space of the ResNet [23] and Big-Little Net ResNeXt [11] architectures directly on ImageNet22K, our polyharmonic spline NAS method designed a model which achieved a top-1 accuracy of 40.03% on ImageNet22K, an absolute improvement of 3.13% over the state of the art with similar global batch size [15].
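The abstract's core idea, using a polyharmonic spline as a cheap surrogate over the architecture search space, can be illustrated with a minimal sketch. This is not the paper's implementation: the 1-D search space, the `toy_accuracy` function, and all parameter names here are hypothetical stand-ins for real architecture hyperparameters and expensive training runs; SciPy's `RBFInterpolator` with a thin-plate-spline kernel serves as one concrete polyharmonic spline.

```python
# Hedged sketch of spline-surrogate architecture search (assumptions noted above).
import numpy as np
from scipy.interpolate import RBFInterpolator

def toy_accuracy(depth):
    # Hypothetical stand-in for "train an architecture and measure accuracy".
    return 0.40 - 0.002 * (depth - 50.0) ** 2 / 100.0

# Evaluate a handful of sampled architectures (the expensive step in real NAS).
sampled = np.array([10.0, 30.0, 50.0, 70.0, 90.0])
scores = np.array([toy_accuracy(d) for d in sampled])

# Fit a polyharmonic (thin-plate) spline through the observed (config, score) pairs.
spline = RBFInterpolator(sampled[:, None], scores, kernel="thin_plate_spline")

# Query the cheap surrogate densely and pick the predicted-best architecture,
# which would then be trained fully in a real search.
grid = np.linspace(10.0, 90.0, 161)
pred = spline(grid[:, None])
best = grid[np.argmax(pred)]
print(f"predicted best depth: {best:.1f}")
```

The point of the surrogate is that only the five sampled configurations require training; the dense grid is evaluated through the fitted spline, which costs microseconds rather than GPU-days.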

Authors (11)
  1. Ulrich Finkler
  2. Michele Merler
  3. Rameswar Panda
  4. Mayoore S. Jaiswal
  5. Hui Wu
  6. Kandan Ramakrishnan
  7. Chun-Fu Chen
  8. Minsik Cho
  9. David Kung
  10. Rogerio Feris
  11. Bishwaranjan Bhattacharjee
Citations (1)
