Title
Multi-Objective Hyperparameter Optimization for Spiking Neural Network Neuroevolution
Abstract
Neuroevolution has had significant success over recent years, but there has been relatively little work applying neuroevolution approaches to spiking neural networks (SNNs). SNNs are a type of neural network that includes a temporal processing component; they are not easily trained with other methods because they lack differentiable activation functions, and they can be deployed on energy-efficient neuromorphic hardware. In this work, we investigate two evolutionary approaches for training SNNs. We explore the impact of the hyperparameters of the evolutionary approaches, including tournament size, population size, and representation type, on the performance of the algorithms. We present a multi-objective Bayesian-based hyperparameter optimization approach to tune these hyperparameters to produce the most accurate and smallest SNNs. We show that the hyperparameters can significantly affect the performance of these algorithms. We also perform a sensitivity analysis and demonstrate that every hyperparameter value has the potential to perform well, provided the other hyperparameter values are set appropriately.
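The sketch below gives a rough illustration of the kind of search the abstract describes: a multi-objective study over tournament size, population size, and representation type that maximizes accuracy while minimizing network size. It assumes the Optuna library with a TPE (Bayesian-style) sampler; the search ranges and the evolve_snn() evaluation function are hypothetical stand-ins, not the paper's actual evolutionary SNN training pipeline.

```python
# Minimal sketch of a multi-objective, Bayesian-style hyperparameter search,
# assuming the Optuna library. Hyperparameter ranges and evolve_snn() are
# hypothetical stand-ins, not the paper's actual method.
import optuna


def evolve_snn(population_size, tournament_size, representation):
    # Hypothetical stand-in: a real run would evolve an SNN with these
    # hyperparameters and return (validation accuracy, network size).
    accuracy = min(1.0, 0.7 + 0.0005 * population_size
                   - 0.005 * abs(tournament_size - 8))
    network_size = 10 + population_size // 20 + (5 if representation == "graph" else 0)
    return accuracy, network_size


def objective(trial):
    population_size = trial.suggest_int("population_size", 50, 500)
    tournament_size = trial.suggest_int("tournament_size", 2, 16)
    representation = trial.suggest_categorical("representation", ["direct", "graph"])
    return evolve_snn(population_size, tournament_size, representation)


# Two objectives: maximize accuracy, minimize network size.
# TPESampler provides a Bayesian-style (tree-structured Parzen estimator)
# search; multi-objective TPE requires a recent Optuna release (>= 3.0).
study = optuna.create_study(
    directions=["maximize", "minimize"],
    sampler=optuna.samplers.TPESampler(seed=0),
)
study.optimize(objective, n_trials=50)

# Pareto-optimal trials: each trades accuracy against SNN size.
for trial in study.best_trials:
    print(trial.values, trial.params)
```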
Year
2021
DOI
10.1109/CEC45853.2021.9504897
Venue
2021 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC 2021)
Keywords
spiking neural networks, neuromorphic computing, evolutionary algorithms
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
6