Title
A Scalable Algorithm for the Optimization of Neural Network Architectures
Abstract
We propose a new scalable method to optimize the architecture of an artificial neural network. The proposed algorithm, called Greedy Search for Neural Network Architecture, aims to determine a neural network with a minimal number of layers that performs at least as well, in terms of accuracy and computational cost, as neural networks of the same structure identified by other hyperparameter search algorithms. Numerical experiments on benchmark datasets show that, for these datasets, our method outperforms state-of-the-art hyperparameter optimization algorithms both in the predictive performance attained by the selected neural network architecture and in the time-to-solution for the hyperparameter optimization to complete.
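The abstract does not spell out the search procedure, but the idea of a greedy constructive search over network depth can be sketched as follows. This is a minimal illustration only, not the authors' GSNNA implementation: it assumes a scikit-learn MLPClassifier on a toy dataset, and the names and settings (validation_accuracy, width, max_depth, tolerance) are hypothetical choices for the sketch.

# Minimal sketch of a greedy constructive search over network depth.
# Hypothetical illustration: grows one hidden layer at a time and stops
# when validation accuracy no longer improves, returning the shallowest
# architecture found. Not the authors' GSNNA implementation.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

def validation_accuracy(hidden_layers):
    """Train an MLP with the given hidden-layer widths and score it on held-out data."""
    model = MLPClassifier(hidden_layer_sizes=hidden_layers,
                          max_iter=300, random_state=0)
    model.fit(X_tr, y_tr)
    return model.score(X_val, y_val)

width = 64          # assumed fixed width per hidden layer
max_depth = 6       # assumed depth budget
tolerance = 1e-3    # minimum improvement required to keep growing

layers = (width,)
best_acc = validation_accuracy(layers)
while len(layers) < max_depth:
    candidate = layers + (width,)   # greedily append one more layer
    acc = validation_accuracy(candidate)
    if acc <= best_acc + tolerance:
        break                       # no meaningful gain: keep the shallower network
    layers, best_acc = candidate, acc

print(f"selected hidden layers: {layers}, validation accuracy: {best_acc:.3f}")

Because the loop only accepts a deeper candidate when it strictly improves validation accuracy beyond the tolerance, the search terminates with the smallest depth among the architectures it evaluated that reach the best observed score.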
Year
2021
DOI
10.1016/j.parco.2021.102788
Venue
PARALLEL COMPUTING
Keywords
Deep learning, Hyperparameter optimization, Neural network architecture, Random search, Greedy constructive algorithms, Adaptive algorithms
DocType
Journal
Volume
104
ISSN
0167-8191
Citations
0
PageRank
0.34
References
0
Authors
4
Name | Order | Citations | PageRank
Massimiliano Lupo Pasini | 1 | 4 | 1.54
Junqi Yin | 2 | 0 | 1.01
Ying Wai Li | 3 | 0 | 0.34
Markus Eisenbach | 4 | 9 | 4.17