Title
Incremental Training of Deep Convolutional Neural Networks.
Abstract
We propose an incremental training method that partitions the original network into sub-networks, which are then gradually incorporated into the running network during the training process. To allow for smooth dynamic growth of the network, we introduce a look-ahead initialization that outperforms random initialization. We demonstrate that our incremental approach reaches the baseline accuracy of the reference network. In addition, it makes it possible to identify smaller partitions of the original state-of-the-art network that deliver the same final accuracy while using only a fraction of the total number of parameters, which allows for a potential speedup of the training time by several factors. We report training results on CIFAR-10 for ResNet and VGGNet.
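The abstract only outlines the approach at a high level. The sketch below is a minimal, hypothetical illustration (in PyTorch, which the paper does not specify) of training a network that grows sub-network by sub-network: each phase appends one block and continues training, with a small near-identity initialization used as a crude stand-in for the paper's look-ahead initialization. The class names `GrowingNet` and `Block`, all hyperparameters, and the random stand-in data are invented for illustration and are not the authors' implementation.

```python
# Minimal, hypothetical sketch (NOT the authors' exact algorithm): a CNN that
# grows one residual-style sub-network at a time during training.
import torch
import torch.nn as nn


class Block(nn.Module):
    """Residual-style sub-network that can be appended to the running model."""

    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        return x + torch.relu(self.conv(x))


class GrowingNet(nn.Module):
    """Network whose block list starts empty and is extended between phases."""

    def __init__(self, channels=16, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.blocks = nn.ModuleList()
        self.head = nn.Linear(channels, num_classes)
        self.channels = channels

    def add_block(self):
        block = Block(self.channels)
        # Near-identity initialization: the new block barely perturbs the
        # already-trained network (a crude stand-in for look-ahead init).
        nn.init.normal_(block.conv.weight, std=1e-3)
        nn.init.zeros_(block.conv.bias)
        self.blocks.append(block)

    def forward(self, x):
        x = torch.relu(self.stem(x))
        for block in self.blocks:
            x = block(x)
        return self.head(x.mean(dim=(2, 3)))  # global average pooling


model = GrowingNet()
criterion = nn.CrossEntropyLoss()

for phase in range(3):  # each phase incorporates one more sub-network
    model.add_block()
    # Rebuild the optimizer so the newly added parameters are trained too.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    for step in range(5):  # toy inner loop with random stand-in data
        x = torch.randn(8, 3, 32, 32)  # CIFAR-10-shaped random batch
        y = torch.randint(0, 10, (8,))
        loss = criterion(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"phase {phase}: {len(model.blocks)} blocks, loss {loss.item():.3f}")
```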
Year
2018
Venue
AutoML@PKDD/ECML
DocType
Journal
Volume
abs/1803.10232
ISSN
http://ceur-ws.org/Vol-1998
Citations
4
PageRank
0.38
References
8
Authors
4
Name                        Order  Citations  PageRank
Roxana Istrate              1      16         2.96
A Cristiano I Malossi       2      65         9.29
Costas Bekas                3      81         12.75
Dimitrios S. Nikolopoulos   4      1469       128.40