Title
Fast Evolutionary Neural Architecture Search Based on Bayesian Surrogate Model
Abstract
Neural Architecture Search (NAS) is studied to automatically design deep neural network structures, freeing people from heavy network design tasks. Traditional NAS based on individual performance evaluation must train many of the networks generated during the search and compare them by their accuracy, which is very time-consuming. In this study, we propose a two-category comparator based on a random forest model as a surrogate to estimate the accuracy of the networks, thereby reducing the heavy network training process and greatly saving search time. Instead of directly predicting the accuracy of each network, the proposed two-category comparator predicts the relative performance of each pair of networks. Furthermore, we build the surrogate model in a sampled subspace of the original training data, which further accelerates the search process in NAS. Experimental results show that our proposed NAS framework greatly reduces the search time, while the accuracy of the obtained network is comparable to that of other state-of-the-art NAS algorithms.
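To make the pairwise idea concrete: rather than regressing each network's accuracy, a random-forest classifier can be trained to predict which of two candidate architectures performs better, and candidates can then be ranked by predicted wins without training them. The sketch below illustrates that idea only; it is not the authors' implementation, and the fixed-length architecture encodings, toy data, and helper names (build_pairwise_dataset, predicted_wins) are assumptions for illustration.

# Minimal sketch (not the paper's code) of a "two-category" pairwise
# comparator surrogate. Assumes each architecture is already encoded as a
# fixed-length feature vector and that a small pool of architectures has
# been trained to obtain ground-truth accuracies.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def build_pairwise_dataset(encodings, accuracies):
    # Turn (architecture, accuracy) pairs into a binary comparison set:
    # label 1 if the first architecture of the pair is the more accurate one.
    X, y = [], []
    n = len(encodings)
    for i in range(n):
        for j in range(n):
            if i != j:
                X.append(np.concatenate([encodings[i], encodings[j]]))
                y.append(int(accuracies[i] > accuracies[j]))
    return np.array(X), np.array(y)

# Toy data: 20 architectures encoded as 8-dimensional vectors (assumption).
rng = np.random.default_rng(0)
encodings = rng.random((20, 8))
accuracies = rng.random(20)

X, y = build_pairwise_dataset(encodings, accuracies)
comparator = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def predicted_wins(candidate, pool):
    # Score a candidate by how often the comparator predicts it beats the
    # architectures in the evaluated pool (no extra network training needed).
    pairs = np.array([np.concatenate([candidate, other]) for other in pool])
    return int(comparator.predict(pairs).sum())

# Rank unseen candidates against the evaluated pool and pick the best.
candidates = rng.random((5, 8))
scores = [predicted_wins(c, encodings) for c in candidates]
print("best candidate index:", int(np.argmax(scores)))

In an evolutionary NAS loop of the kind the abstract describes, such a comparator would stand in for full training when ranking offspring architectures; here it simply counts predicted pairwise wins against the already-evaluated pool.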
Year
2021
DOI
10.1109/CEC45853.2021.9504999
Venue
2021 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC 2021)
Keywords
Bayesian Optimization, Surrogate-assisted Evolutionary Algorithm, Neural Architecture Search, Deep Neural Network
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name            Order   Citations   PageRank
Rui Shi         1       0           0.34
Jianping Luo    2       34          6.76
Qiqi Liu        3       38          4.99