Title
Hyperparameter optimization of deep neural network using univariate dynamic encoding algorithm for searches
Abstract
This paper proposes a method for tuning the hyperparameters of a deep neural network using the univariate dynamic encoding algorithm for searches (uDEAS). Optimizing the hyperparameters of such a network is difficult because several parameters must be configured and training is slow. The proposed method was tested on two neural network models, an autoencoder and a convolutional neural network, with the Modified National Institute of Standards and Technology (MNIST) dataset. The cost functions to be minimized were chosen as the average difference between the decoded output and the original image for the autoencoder, and the inverse of the evaluation accuracy for the convolutional neural network. The proposed method optimized the hyperparameters with fast convergence and few computational resources, and the results were compared with those of other optimization algorithms (namely, simulated annealing, the genetic algorithm, and particle swarm optimization) to demonstrate the effectiveness of the proposed methodology.
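The abstract defines two cost functions: the average difference between the decoded output and the original image for the autoencoder, and the inverse of the evaluation accuracy for the convolutional network. A minimal NumPy sketch of these costs is given below; the function names, the interpretation of "average difference" as a mean absolute error, and the dummy MNIST-shaped data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def autoencoder_cost(decoded, original):
    """Average absolute difference between the decoded output and the original images
    (assumed interpretation of the abstract's 'average of the difference')."""
    return float(np.mean(np.abs(decoded - original)))

def cnn_cost(eval_accuracy, eps=1e-12):
    """Inverse of the evaluation accuracy; lower cost corresponds to higher accuracy."""
    return 1.0 / max(eval_accuracy, eps)

# Example with dummy MNIST-shaped data (28x28 grayscale images), values in [0, 1].
rng = np.random.default_rng(0)
original = rng.random((8, 28, 28))
decoded = np.clip(original + 0.05 * rng.standard_normal((8, 28, 28)), 0.0, 1.0)
print(autoencoder_cost(decoded, original))  # small reconstruction cost
print(cnn_cost(0.97))                       # about 1.03 for 97% accuracy
```

In a gradient-free search such as uDEAS, each candidate hyperparameter set would be used to train the model, scored with one of these costs, and the lowest-cost candidate retained.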
Year
2019
DOI
10.1016/j.knosys.2019.04.019
Venue
Knowledge-Based Systems
Keywords
Hyperparameter optimization, Gradient-free optimization, Deep neural network, Convolution neural network, Autoencoder
Field
Simulated annealing, Hyperparameter optimization, Data mining, Autoencoder, MNIST database, Pattern recognition, Hyperparameter, Convolutional neural network, Computer science, Artificial intelligence, Artificial neural network, Genetic algorithm
DocType
Journal
Volume
178
ISSN
0950-7051
Citations
0
PageRank
0.34
References
0
Authors
1
Name
Y Yoo
Order
1
Citations
PageRank
8424.92