Title
DenseNet-DC: Optimizing DenseNet Parameters Through Feature Map Generation Control.
Abstract
Convolutional neural networks still demand substantial computational power, which often restricts their use on many platforms. We therefore propose a new optimization method for DenseNet, a convolutional neural network whose defining characteristic is its dense connectivity. The method controls the generation of feature maps according to the layer's position in the network, aiming to reduce the size of the network with minimal loss of accuracy. This control is achieved by reducing the number of feature maps through a new parameter called the Decrease Control (dc) value, with the reduction applied from the midpoint of the layers onward. To validate the behavior of the proposed model, experiments were performed on different image datasets: MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100, CALTECH-101, Cats vs Dogs, and TinyImageNet. Among the results: on MNIST and Fashion-MNIST, parameters were reduced by 43%; on CIFAR-10, the reduction was 44%; on CIFAR-100, 43%; on CALTECH-101, 35%; on Cats vs Dogs, 30%; and on TinyImageNet, 31%.
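The sketch below is not the authors' code; it illustrates one plausible reading of the Decrease Control idea described in the abstract, assuming dc acts as a divisor on the DenseNet growth rate for layers in the second half of the network, so fewer new feature maps are generated there. The function name feature_maps_per_layer and the exact reduction rule (growth_rate // dc) are assumptions for illustration.

```python
def feature_maps_per_layer(num_layers, growth_rate, dc):
    """Return the number of new feature maps produced by each dense layer.

    Layers in the first half use the full growth rate; layers from the
    midpoint onward use a growth rate reduced by the dc value
    (assumed here to act as a divisor).
    """
    midpoint = num_layers // 2
    maps = []
    for layer in range(num_layers):
        if layer < midpoint:
            maps.append(growth_rate)                # unchanged DenseNet behaviour
        else:
            maps.append(max(1, growth_rate // dc))  # reduced feature map generation (assumed rule)
    return maps


if __name__ == "__main__":
    baseline = feature_maps_per_layer(num_layers=12, growth_rate=12, dc=1)
    reduced = feature_maps_per_layer(num_layers=12, growth_rate=12, dc=2)
    print("baseline total new feature maps:", sum(baseline))   # 144
    print("with dc=2 total new feature maps:", sum(reduced))   # 108
```

Because each DenseNet layer's input is the concatenation of all previous feature maps, generating fewer maps in later layers also shrinks every subsequent layer's input, which is why a modest per-layer reduction can translate into the 30-44% parameter savings reported in the abstract.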
Year: 2020
DOI: 10.22456/2175-2745.98369
Venue: RITA
DocType: Journal
Volume: 27
Issue: 3
Citations: 0
PageRank: 0.34
References: 0
Authors: 2

Name                        Order  Citations  PageRank
André Tavares da Silva      1      0          0.34
Cristiano Roberto Siebert   2      0          0.34