Title
The Effect Of Reduced Training In Neural Architecture Search
Abstract
As neural architecture search becomes an increasingly studied field, it has become apparent that it demands substantial computational resources. These are mostly consumed by training and evaluating intermediate solutions during the search phase. Although most researchers focus on developing more efficient search methods, the dominant computational cost, as a percentage of execution time, lies in evaluating candidate architectures. As such, many works reduce the number of training epochs used for search-phase evaluations. In this work, we study the effect of reduced training in neural architecture search. We focus on whether the relative rankings of architectures are retained when they are trained with different optimizers and for varying numbers of epochs. We find relatively high rank correlations (Kendall's tau-b > 0.7) between fully and partially trained, arbitrarily connected architectures, generated by mutating a simple convolutional architecture for the CIFAR-10 image recognition dataset. Furthermore, we observe similar behavior in networks sampled from the NASBench neural architecture dataset, which consist of a fixed outer skeleton and a variable cell module. Finally, we demonstrate the ability of genetic algorithms to find optimal solutions in noisy environments by simulating the previous findings with perturbed n-dimensional Rastrigin functions.
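(A minimal illustrative sketch, not the authors' code: the accuracy values and noise level below are invented placeholders. It shows the two measurements the abstract relies on: Kendall's tau-b rank correlation, computed here with scipy.stats.kendalltau, and an n-dimensional Rastrigin function perturbed with additive Gaussian noise.)

    import numpy as np
    from scipy.stats import kendalltau  # computes the tau-b variant by default

    # (1) Hypothetical validation accuracies of the same five architectures
    # after a reduced and a full training schedule.
    acc_reduced = [0.61, 0.58, 0.70, 0.55, 0.66]   # e.g. after a few epochs
    acc_full    = [0.89, 0.85, 0.93, 0.84, 0.90]   # e.g. after full training
    tau, p_value = kendalltau(acc_reduced, acc_full)
    print(f"Kendall's tau-b: {tau:.3f} (p = {p_value:.3f})")

    # (2) n-dimensional Rastrigin function with additive Gaussian noise,
    # a stand-in for the ranking noise introduced by reduced training.
    def perturbed_rastrigin(x, noise_std=1.0, rng=np.random.default_rng()):
        x = np.asarray(x, dtype=float)
        n = x.size
        value = 10 * n + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
        return value + rng.normal(0.0, noise_std)

    print(perturbed_rastrigin(np.zeros(5)))  # the true optimum (0) plus noise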
Year
2020
DOI
10.1007/s00521-020-04915-6
Venue
NEURAL COMPUTING & APPLICATIONS
Keywords
Neural architecture search, Deep learning, Ranking, NASBench
DocType
Journal
Volume
32
Issue
23
ISSN
0941-0643
Citations
0
PageRank
0.34
References
0
Authors
2
Name                     Order  Citations  PageRank
George Kyriakides        1      1          1.02
Konstantinos Margaritis  2      9          3.26