Title
Evolution of Abstraction Across Layers in Deep Learning Neural Networks
Abstract
Deep learning neural networks produce excellent results in various pattern recognition tasks. It is of great practical importance to answer open questions regarding model design and parameterization, and to understand how input data are converted into meaningful knowledge at the output. The layer-by-layer evolution of the abstraction level has previously been proposed as a quantitative measure to describe the emergence of knowledge in the network. In this work we systematically evaluate the abstraction level for a variety of image datasets. We observe a general tendency of increasing abstraction from input to output, with the exception of a drop in abstraction at some ReLU and pooling layers. The abstraction level is relatively low and does not change significantly in the first few layers following the input, while it fluctuates around a high saturation value in the layers preceding the output. Finally, the layer-by-layer change in abstraction is not normally distributed; rather, it approximates an exponential distribution. These results point to salient local features of deep layers impacting overall (global) classification performance. We compare the results extracted from deep learning neural networks performing image processing tasks with results obtained by analyzing brain imaging data. Our conclusions may be helpful in future designs of more efficient, compact deep learning neural networks.
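The record above does not define the paper's abstraction measure. As a minimal, hedged sketch of this kind of layer-by-layer analysis, the snippet below uses a generic proxy statistic (the participation ratio of the activation covariance spectrum, a hypothetical stand-in for the paper's actual measure) on simulated per-layer activations, then inspects the layer-to-layer changes that the abstract says approximate an exponential distribution:

```python
import numpy as np

def participation_ratio(acts):
    """Effective dimensionality of an activation matrix (samples x units).

    A generic proxy for per-layer 'abstraction'; NOT the measure used in
    the paper, which this metadata record does not specify.
    """
    acts = acts - acts.mean(axis=0)                 # center each unit
    ev = np.linalg.eigvalsh(np.cov(acts, rowvar=False))
    ev = np.clip(ev, 0.0, None)                     # guard tiny negatives
    return ev.sum() ** 2 / (np.square(ev).sum() + 1e-12)

rng = np.random.default_rng(0)
# Simulated activations for 5 layers of decreasing width (stand-ins for
# activations one would extract from a trained CNN, layer by layer).
widths = (64, 64, 32, 32, 16)
layers = [rng.normal(size=(200, w)) for w in widths]

scores = [participation_ratio(a) for a in layers]   # per-layer statistic
deltas = np.diff(scores)                            # layer-by-layer change
```

With real networks, `layers` would instead hold activations captured at each layer (e.g. via forward hooks) for a batch of inputs, and the empirical distribution of `deltas` could then be compared against normal and exponential fits.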
Year
2018
DOI
10.1016/j.procs.2018.10.520
Venue
Procedia Computer Science
Keywords
Deep Learning, Convolutional Neural Networks, Abstraction Level, Image Processing, Knowledge
Field
Data mining, Abstraction, Computer science, Pooling, Image processing, Exponential distribution, Artificial intelligence, Deep learning, Artificial neural network, Abstraction layer, Machine learning, Salient
DocType
Conference
Volume
144
ISSN
1877-0509
Citations
0
PageRank
0.34
References
2
Authors
3
Name                 Order  Citations  PageRank
Robert Kozma         1      211        0.20
Roman Ilin           2      941        3.23
Hava T. Siegelmann   3      9801       45.09