Title
DropCircuit: A Modular Regularizer for Parallel Circuit Networks
Abstract
How to design and train increasingly large neural network models has been an active research topic for several years. However, while there is a large body of work on training deeper and/or wider models, there is relatively little systematic research on the effective use of wide modular neural networks. To address this gap, and in an attempt to reduce lengthy training times, we previously proposed Parallel Circuits (PCs), a biologically inspired architecture based on the design of the retina. Our previous work showed that this approach achieves sharp speed gains but fails to maintain generalization performance. To address this issue, and motivated by the way dropout prevents node co-adaptation, in this paper we propose an improvement that extends dropout to the parallel-circuit architecture. The paper provides empirical evidence and multiple insights into this combination. Experiments show promising results, with improved error rates in most cases while the speed advantage of the PC approach is maintained.
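As an illustration of the idea summarized in the abstract, the following is a minimal PyTorch sketch of circuit-level dropout in a parallel-circuit network. All names and design choices here (ParallelCircuitNet, drop_prob, three identical fully connected circuits, merging circuit outputs by averaging) are assumptions made for this sketch and are not taken from the paper; the authors' exact DropCircuit formulation may differ.

import torch
import torch.nn as nn

class ParallelCircuitNet(nn.Module):
    """Sketch of a parallel-circuit network with circuit-level dropout.

    Each "circuit" is an independent sub-network; during training, whole
    circuits are dropped at random (analogous to node-level dropout, but
    applied to entire parallel paths). Hypothetical design, for illustration.
    """

    def __init__(self, in_dim=784, hidden_dim=64, out_dim=10,
                 num_circuits=3, drop_prob=0.5):
        super().__init__()
        self.drop_prob = drop_prob
        self.circuits = nn.ModuleList([
            nn.Sequential(
                nn.Linear(in_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, hidden_dim),
                nn.ReLU(),
            )
            for _ in range(num_circuits)
        ])
        # Shared output layer that consumes the merged circuit representations.
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):
        outputs = [circuit(x) for circuit in self.circuits]
        stacked = torch.stack(outputs, dim=0)  # shape: (circuits, batch, hidden)

        if self.training:
            # Sample one keep/drop decision per circuit (not per node).
            keep = (torch.rand(len(self.circuits), device=x.device)
                    > self.drop_prob).float()
            if keep.sum() == 0:
                # Always keep at least one circuit active.
                keep[torch.randint(len(self.circuits), (1,))] = 1.0
            merged = (stacked * keep.view(-1, 1, 1)).sum(0) / keep.sum()
        else:
            # At test time all circuits are active; average their outputs.
            merged = stacked.mean(0)

        return self.head(merged)


# Example usage: one training-mode forward pass on random data.
model = ParallelCircuitNet()
logits = model(torch.randn(32, 784))
print(logits.shape)  # torch.Size([32, 10])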
Year: 2018
DOI: https://doi.org/10.1007/s11063-017-9677-4
Venue: Neural Processing Letters
Keywords: Parallel circuits, Deep learning, Dropout, DropCircuit
Field: Architecture, Existential quantification, Computer science, Artificial intelligence, Modular design, Series and parallel circuits, Deep learning, Artificial neural network, Machine learning, Spite
DocType: Journal
Volume: 47
Issue: 3
ISSN: 1370-4621
Citations: 0
PageRank: 0.34
References: 19
Authors: 4
Name            | Order | Citations | PageRank
Kien Tuong Phan | 1     | 2         | 1.07
T. H. Maul      | 2     | 17        | 6.41
Tuong Thuy Vu   | 3     | 37        | 5.47
Weng Kin Lai    | 4     | 54        | 7.98