| Abstract |
| --- |
| In this paper we study the effect of target set size on transfer learning in deep convolutional neural networks. This is an important problem, as labelling is costly, and for new or specific classes the number of labelled instances available may simply be too small. We present results for a series of experiments in which we either train on a target set of classes from scratch, retrain all layers, or progressively lock more layers in the network, for the Tiny-ImageNet and MiniPlaces2 data sets. Our findings indicate that for smaller target data sets, freezing the weights of the initial layers of the network gives better results on the target set classes. We present a simple and easy-to-implement training heuristic based on these findings. |
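The heuristic described in the abstract (freeze the weights of the initial layers when the target data set is small, otherwise retrain all layers) can be sketched as follows. The AlexNet-style layer names, the freeze count, and the size threshold are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of the layer-freezing heuristic from the abstract.
# Layer names, n_freeze, and small_threshold are illustrative choices.

ALEXNET_LAYERS = ["conv1", "conv2", "conv3", "conv4", "conv5",
                  "fc6", "fc7", "fc8"]

def trainable_layers(layers, target_set_size,
                     small_threshold=5000, n_freeze=5):
    """Return the layers left trainable: freeze the first n_freeze
    layers when the target set is small, otherwise retrain everything."""
    if target_set_size < small_threshold:
        return layers[n_freeze:]   # small set: fine-tune top layers only
    return list(layers)            # enough data: retrain all layers

print(trainable_layers(ALEXNET_LAYERS, 1000))   # small target set
print(trainable_layers(ALEXNET_LAYERS, 50000))  # large target set
```

In a real fine-tuning run, the frozen layers would keep the weights learned on the source task while gradients update only the returned layers.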
| Year | DOI | Venue |
| --- | --- | --- |
| 2016 | 10.1007/978-3-319-46349-0_5 | ADVANCES IN INTELLIGENT DATA ANALYSIS XV |

| Keywords | Field | DocType |
| --- | --- | --- |
| Deep learning, Convolutional neural networks, Transfer learning, Learning curves, AlexNet | Competitive learning, Data set, Heuristic, Pattern recognition, Computer science, Convolutional neural network, Transfer of learning, Deep belief network, Artificial intelligence, Deep learning, Learning curve, Machine learning | Conference |

| Volume | ISSN | Citations |
| --- | --- | --- |
| 9897 | 0302-9743 | 4 |

| PageRank | References | Authors |
| --- | --- | --- |
| 0.47 | 11 | 3 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Deepak Soekhoe | 1 | 4 | 0.47 |
| Peter van der Putten | 2 | 5 | 1.18 |
| Aske Plaat | 3 | 524 | 72.18 |