- The paper pioneers a multi-objective framework that jointly maximizes predictive accuracy while minimizing resource usage such as model size and computation.
- It applies Lamarckian evolution via (approximate) network morphisms: child networks inherit their parents' learned weights, so beneficial architectural traits carry over across generations without retraining from scratch.
- A two-stage sampling strategy efficiently explores sparsely populated regions of the Pareto front, yielding competitive CIFAR-10 results with reduced GPU usage.
Efficient Multi-objective Neural Architecture Search via Lamarckian Evolution
The paper presents LEMONADE (Lamarckian Evolutionary algorithm for Multi-Objective Neural Architecture DEsign), an approach to Neural Architecture Search (NAS) that emphasizes computational efficiency while balancing multiple objectives. The work addresses two primary limitations of existing NAS methods: their high computational demand, and their narrow focus on maximizing predictive performance without considering resource usage.
Key Contributions
- Multi-objective Framework: LEMONADE tackles NAS as a multi-objective optimization problem, which contrasts with the prevalent single-objective approaches. It aims not only to maximize predictive accuracy but also to minimize resource consumption, such as the number of parameters, computation requirements, and inference time.
- Lamarckian Evolution: A distinctive feature of LEMONADE is its use of Lamarckian inheritance implemented via network morphisms, operators that modify an architecture while preserving the function the network computes, so that child networks start from their parents' learned weights rather than from scratch. Because exact morphisms can only grow a network, the paper introduces approximate network morphisms that shrink architectures while roughly preserving performance, which is essential for exploring the resource-efficient end of the Pareto front.
- Efficient Resource Utilization: LEMONADE employs a two-stage sampling strategy that prioritizes models in sparsely populated regions of the Pareto front, thus reducing the number of expensive evaluations. This is especially beneficial for leveraging cheap-to-evaluate objectives like model size, which helps in filtering candidate architectures before engaging in costly predictive performance evaluations.
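The multi-objective framing rests on Pareto dominance: one architecture dominates another if it is no worse in every objective and strictly better in at least one, and the search maintains the set of non-dominated candidates. A minimal sketch of that bookkeeping for two objectives (validation error and parameter count; the `Candidate` class and field names are ours, not from the paper):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Candidate:
    error: float   # validation error, to be minimized
    params: float  # parameter count in millions, to be minimized

def dominates(a: Candidate, b: Candidate) -> bool:
    # a dominates b: no worse in every objective, strictly better in at least one
    return (a.error <= b.error and a.params <= b.params
            and (a.error < b.error or a.params < b.params))

def pareto_front(population):
    # keep every candidate that no other member of the population dominates
    return [c for c in population
            if not any(dominates(other, c) for other in population if other is not c)]
```

The returned front is exactly the set of trade-offs presented to the user after the search; LEMONADE uses its population's Pareto front in the same role.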
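To make the morphism idea concrete, here is a Net2Net-style widening of one fully connected hidden layer: replicated units copy their incoming weights and split their outgoing weights, so the widened network computes exactly the same function as its parent. This is an illustrative sketch of the function-preserving principle, not LEMONADE's actual operator set:

```python
import numpy as np

def widen(W1, W2, new_width, rng=None):
    """Function-preserving widening of a hidden layer (Net2Net-style sketch).
    W1: (in_dim, hidden) incoming weights; W2: (hidden, out_dim) outgoing weights.
    Replicates randomly chosen hidden units and divides their outgoing weights
    by the replication count, so x @ W1 @ W2 is unchanged."""
    rng = np.random.default_rng(0) if rng is None else rng
    old_width = W1.shape[1]
    extra = rng.integers(0, old_width, new_width - old_width)
    mapping = np.concatenate([np.arange(old_width), extra])
    counts = np.bincount(mapping, minlength=old_width)
    W1_new = W1[:, mapping]                             # copy incoming weights
    W2_new = W2[mapping, :] / counts[mapping][:, None]  # split outgoing weights
    return W1_new, W2_new
```

Because the child starts with weights that already reproduce the parent's behavior, it only needs a short fine-tuning phase, which is the source of LEMONADE's per-child training savings.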
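The sparsity-aware sampling can be sketched as well: estimate how densely the current population covers a cheap objective (such as log parameter count) with a kernel density estimate, then select parents with probability inversely proportional to that density, so under-explored regions of the front get more attention. The function below is our simplified rendering of that first stage, with a hand-picked Gaussian kernel and bandwidth; the paper's full procedure also uses the density estimate to accept or reject children before any expensive training:

```python
import numpy as np

def parent_sampling_probs(cheap_objective, bandwidth=0.5):
    """Sketch of density-inverse parent selection on a cheap objective.
    cheap_objective: per-candidate values (e.g. log #parameters).
    Returns a probability for each candidate, higher in sparse regions."""
    x = np.asarray(cheap_objective, dtype=float)
    z = (x[:, None] - x[None, :]) / bandwidth
    density = np.exp(-0.5 * z**2).sum(axis=1)  # unnormalized Gaussian KDE
    weights = 1.0 / density                    # favor sparsely populated regions
    return weights / weights.sum()
```

Since the density is computed on objectives that cost almost nothing to evaluate, many candidate children can be screened this way for each expensive training run.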
Numerical Results
LEMONADE was evaluated on CIFAR-10 for multiple objectives, including predictive error and model size. The algorithm demonstrated competitive results, discovering architectures with validation errors as low as 3.6% using significantly fewer computational resources than traditional methods like NASNet. Notably, LEMONADE required 80 GPU days as opposed to NASNet’s 2000 GPU days for comparable performance.
Theoretical and Practical Implications
The introduction of a multi-objective approach in NAS allows designers to make informed trade-off decisions post hoc, selecting architectures tailored to specific application constraints. The Lamarckian inheritance mechanism further underscores the promise of transferability and adaptability of evolved architectures to diverse computational settings.
Future Directions
This paper opens avenues for integrating more complex evolutionary algorithms and extending the architecture search into more varied and larger topologies. Another interesting direction would be to incorporate sophisticated network compression techniques as operators within LEMONADE. Enhancing the weighted objective approach and defining dynamic trade-offs during the search process could also bolster the applicability of this framework across different domains.
In summary, LEMONADE advances the field of neural architecture search by effectively balancing the computational burden with multiple performance criteria. Its evolution-inspired methodology is particularly relevant for developing resource-conscious AI models, encouraging further exploration in this promising intersection of NAS and multi-objective optimization.