Abstract |
---|
Any neuro-evolutionary algorithm that solves complex problems needs to deal with the issue of computational complexity. We show how a neural network (feed-forward, recurrent or RBF) can be transformed and then compiled in order to achieve fast execution speeds without requiring dedicated hardware like FPGAs. The compiled network uses a simple external data structure, a vector, for its parameters. This allows the weights of the neural network to be optimised by the evolutionary process without the need to re-compile the structure. In an experimental comparison our method achieves a speedup by a factor of 5 to 10 compared to the standard method of evaluation (i.e., traversing a data structure with optimised C++ code). |
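The abstract only outlines the idea. As a minimal sketch (not the authors' actual code generator), a "compiled" network for a hypothetical fixed 2-3-1 feed-forward topology might look like the following in C++: the topology is hard-coded into straight-line code, while all weights live in an external vector `w` that an evolutionary algorithm can mutate without re-compiling anything. The function name `compiled_net` and the 2-3-1 layout are illustrative assumptions.

```cpp
#include <cmath>
#include <vector>

// Logistic activation used by each unit.
inline double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Hypothetical compiled network: the 2-3-1 structure is baked into the code
// (no traversal of a graph data structure), but every weight is read from
// the external parameter vector w, so evolution only needs to change w.
double compiled_net(const std::vector<double>& w, double x0, double x1) {
    // Hidden layer (3 units), weights w[0..8]: 2 input weights + bias each.
    const double h0 = sigmoid(w[0] * x0 + w[1] * x1 + w[2]);
    const double h1 = sigmoid(w[3] * x0 + w[4] * x1 + w[5]);
    const double h2 = sigmoid(w[6] * x0 + w[7] * x1 + w[8]);
    // Output unit, weights w[9..12]: 3 hidden weights + bias.
    return sigmoid(w[9] * h0 + w[10] * h1 + w[11] * h2 + w[12]);
}
```

An evolutionary loop would then perturb only the entries of `w` and call `compiled_net` on the evaluation inputs; the function itself is compiled once, which is where the reported speedup over repeatedly traversing a network data structure would come from.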
Year | DOI | Venue |
---|---|---|
2010 | 10.1109/IJCNN.2010.5596296 | 2010 International Joint Conference on Neural Networks (IJCNN 2010)
Keywords | Field | DocType
---|---|---
genomics, network topology, neural nets, neural network, evolutionary computation, evolutionary algorithm, feed forward, data structures, data structure, topology, optimization, computational complexity, artificial neural networks, bioinformatics | Data structure, Computer science, Field-programmable gate array, Evolutionary computation, Network topology, Machine code, Artificial intelligence, Artificial neural network, Machine learning, Computational complexity theory, Speedup | Conference
ISSN | Citations | PageRank
---|---|---
2161-4393 | 1 | 0.37
References | Authors
---|---
7 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Nils T. Siebel | 1 | 144 | 12.06 |
Andreas Jordt | 2 | 79 | 6.02 |
Gerald Sommer | 3 | 1 | 0.37 |