Title |
---|
On the hardness of parameter optimization of convolution neural networks using genetic algorithm and machine learning. |
Abstract |
---|
We introduce a method for optimizing the parameters of a convolutional neural network (CNN) using a genetic algorithm (GA). In our experiment, 11 CNN parameters were chosen and encoded as one chromosome. We created 150 datasets by arbitrarily varying these parameters. Among approximately 30 model types, the one with the highest cross-validation score, a random forest trained on this dataset, was used as the fitness function of the GA, and optimized parameters were obtained. To improve the GA, we attempted to filter the data and increase the number of training steps. The randomly revised parameters showed insignificant results, but the final 10 parameter sets reached 67.4% accuracy, 13.7% higher than that of the randomly generated dataset. Among these, one parameter set improved accuracy by 1.7% over that of the existing dataset. |
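The pipeline described in the abstract, training a random forest on (parameter set, accuracy) records and then using it as the GA's fitness function, can be sketched as follows. This is a minimal illustration, not the authors' code: the encoding of the 11 parameters, their value ranges, the synthetic accuracy function, and all GA settings are placeholder assumptions, with scikit-learn's `RandomForestRegressor` standing in as the surrogate model.

```python
import random
from sklearn.ensemble import RandomForestRegressor

random.seed(0)

# Hypothetical encoding: a chromosome is a list of 11 CNN hyperparameters
# (the paper does not list them; integer genes in [0, 9] are placeholders).
N_PARAMS = 11

def random_chromosome():
    return [random.randint(0, 9) for _ in range(N_PARAMS)]

# Stand-in for the 150 (parameters -> accuracy) records the paper collects;
# a synthetic accuracy keeps the sketch self-contained and in [0, 1].
def synthetic_accuracy(chrom):
    return sum(chrom) / (9 * N_PARAMS)

training_X = [random_chromosome() for _ in range(150)]
training_y = [synthetic_accuracy(c) for c in training_X]

# Random-forest surrogate used as the GA's fitness function.
surrogate = RandomForestRegressor(n_estimators=30, random_state=0)
surrogate.fit(training_X, training_y)

def fitness(chrom):
    # Predicted accuracy replaces an expensive CNN training run.
    return surrogate.predict([chrom])[0]

def evolve(pop_size=20, generations=15):
    # Plain generational GA: elitism, truncation selection,
    # one-point crossover, and per-child mutation.
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                       # keep the two elites
        while len(next_pop) < pop_size:
            a, b = random.sample(scored[:10], 2)    # parents from top half
            cut = random.randint(1, N_PARAMS - 1)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # mutate one gene
                child[random.randrange(N_PARAMS)] = random.randint(0, 9)
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Because the surrogate averages training targets that lie in [0, 1], its predictions stay in that range; in the paper this role is played by the random forest that scored highest under cross-validation.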
Year | Venue | Field
---|---|---
2018 | GECCO (Companion) | Convolution, Computer science, Convolutional neural network, Fitness function, Artificial intelligence, Random forest, Artificial neural network, Cross-validation, Genetic algorithm, Machine learning

DocType | ISBN | Citations
---|---|---
Conference | 978-1-4503-5764-7 | 0

PageRank | References | Authors
---|---|---
0.34 | 2 | 3

Name | Order | Citations | PageRank |
---|---|---|---
Hyeon-Chang Lee | 1 | 0 | 0.34 |
Dong-Pil Yu | 2 | 0 | 1.69 |
Yong-Hyuk Kim | 3 | 355 | 40.27 |