Title
Analysis of deep neural networks with the extended data Jacobian matrix
Abstract
Deep neural networks have achieved great success on a variety of machine learning tasks. However, many fundamental questions about them remain open. We introduce the Extended Data Jacobian Matrix (EDJM) as an architecture-independent tool for analyzing neural networks at the data manifold of interest. The spectrum of the EDJM is found to be highly correlated with the complexity of the learned function. After studying the effects of dropout, ensembles, and model distillation using the EDJM, we propose a novel spectral regularization method that improves network performance.
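The abstract describes stacking data-point Jacobians into an EDJM and examining its singular-value spectrum. A minimal sketch of that idea, using a toy NumPy ReLU network (the layer sizes, sample count, and exact stacking convention here are illustrative assumptions, not the paper's precise definition):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU network: x -> W2 @ relu(W1 @ x)
# (hypothetical sizes; chosen only for illustration)
d_in, d_h, d_out = 4, 8, 3
W1 = rng.standard_normal((d_h, d_in))
W2 = rng.standard_normal((d_out, d_h))

def jacobian(x):
    """Jacobian dy/dx at input x: for a ReLU net, W2 @ diag(mask) @ W1."""
    mask = (W1 @ x > 0).astype(float)     # which ReLU units are active at x
    return W2 @ (mask[:, None] * W1)      # shape (d_out, d_in)

# Stack per-example Jacobians over a sample of data points into the EDJM
X = rng.standard_normal((16, d_in))
edjm = np.vstack([jacobian(x) for x in X])  # shape (16 * d_out, d_in)

# Singular-value spectrum of the EDJM, which the paper relates
# to the complexity of the learned function
spectrum = np.linalg.svd(edjm, compute_uv=False)
print(spectrum)  # singular values in descending order
```

A richer spectrum (more large singular values) indicates the network implements more directions of variation over the sampled data, which is the complexity signal the abstract refers to.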
Year
2016
Venue
ICML'16 Proceedings of the 33rd International Conference on Machine Learning - Volume 48
DocType
Conference
Citations
2
PageRank
0.38
References
0
Authors
9

Name                   Order  Citations  PageRank
Shengjie Wang          1      5          3.11
Abdel-rahman Mohamed   2      3772       266.13
Rich Caruana           3      4503       655.71
Jeff Bilmes            4      3420       289.94
Matthai Philipose      5      2705       228.09
Matthew Richardson     6      4655       411.67
Krzysztof Geras        7      75         7.45
Gregor Urban           8      93         5.14
Özlem Aslan            9      36         2.50