Title
An Empirical Study on Improving the Speed and Generalization of Neural Networks Using a Parallel Circuit Approach.
Abstract
One of the common problems of neural networks, especially deep ones, is their lengthy training time. We address this problem at the algorithmic level, proposing a simple parallel design inspired by the parallel circuits found in the human retina. To avoid large matrix computations, we split the original network vertically into parallel circuits and let the backpropagation algorithm flow independently through each subnetwork. Experimental results demonstrate the speed advantage of the proposed approach but also show that this advantage depends on multiple factors. The results further suggest that parallel circuits improve the generalization ability of neural networks, presumably through automatic problem decomposition. By studying network sparsity, we partly justify this theory and propose a potential method for improving the design.
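The vertical split described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes a one-hidden-layer network divided into independent "parallel circuits", where each circuit sees the full input, owns its own small hidden slice, and is updated by backpropagation with no gradient flow between circuits; the combined output is the sum of the circuit outputs. All sizes, the learning rate, and the XOR demo task are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Circuit:
    """One parallel sub-network: full input -> small hidden slice -> output."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)   # this circuit's hidden activations
        return self.h @ self.W2         # its additive output contribution

    def backward(self, x, err, lr):
        # Backpropagation stays inside this circuit: only its own
        # weight matrices are touched, never another circuit's.
        dW2 = self.h.T @ err
        dh = (err @ self.W2.T) * self.h * (1.0 - self.h)
        dW1 = x.T @ dh
        self.W2 -= lr * dW2
        self.W1 -= lr * dW1

def train(circuits, X, Y, lr=0.1, epochs=3000):
    for _ in range(epochs):
        out = sum(c.forward(X) for c in circuits)  # combined prediction
        err = out - Y                              # gradient of 0.5*MSE
        for c in circuits:
            c.backward(X, err, lr)                 # independent updates
    return sum(c.forward(X) for c in circuits)

# Toy demo: two parallel circuits of 2 hidden units each on XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
circuits = [Circuit(2, 2, 1) for _ in range(2)]
init_mse = float(np.mean((sum(c.forward(X) for c in circuits) - Y) ** 2))
pred = train(circuits, X, Y)
final_mse = float(np.mean((pred - Y) ** 2))
```

Because the circuits' weight matrices are block-diagonal rather than one large dense matrix, each circuit's forward and backward pass involves only small matrix products and can run in parallel, which is the source of the speed advantage studied in the paper.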
Year: 2017
DOI: 10.1007/s10766-016-0435-4
Venue: International Journal of Parallel Programming
Keywords: Neural networks, Parallel circuits, Problem decomposition, Backpropagation, Sparsity
Field: Potential method, Computer science, Matrix (mathematics), Algorithm, Theoretical computer science, Series and parallel circuits, Backpropagation, Artificial neural network, Subnetwork, Empirical research
DocType: Journal
Volume: 45
Issue: 4
ISSN: 1573-7640
Citations: 1
PageRank: 0.35
References: 11
Authors: 3

Name             Order  Citations  PageRank
Kien Tuong Phan  1      2          1.07
T. H. Maul       2      17         6.41
Tuong Thuy Vu    3      37         5.47