Title
SIMPLE DEEP LEARNING NETWORK VIA TENSOR-TRAIN HAAR-WAVELET DECOMPOSITION WITHOUT RETRAINING
Abstract
Deep neural networks have recently revolutionized machine learning. However, they suffer from high computation and memory costs, which makes deploying them on hardware with limited resources (e.g., mobile devices) a challenge. To address this problem, we propose a new technique, called Tensor-Train Haar-wavelet decomposition, that decomposes a large weight tensor from a fully-connected layer into a sequence of partial Haar-wavelet matrices without retraining. The novelty originates from the deterministic nature of the partial Haar-wavelet matrices: only their row indices need to be stored instead of the whole matrices. Empirical results demonstrate that our method achieves efficient model compression with limited accuracy loss, even without retraining.
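The storage argument in the abstract can be sketched in a few lines of NumPy: a Haar-wavelet matrix is fully determined by its size, so a "partial" version of it is described entirely by the row indices that were kept, and only a small projected factor must be stored alongside those indices. Everything below (the recursive construction, the choice of the first r rows, the rank-r reconstruction) is an illustrative assumption, not the authors' algorithm.

    import numpy as np

    def haar_matrix(n):
        # Orthonormal Haar-wavelet matrix of size n x n (n a power of 2),
        # built with the standard recursive Kronecker construction.
        if n == 1:
            return np.array([[1.0]])
        h = haar_matrix(n // 2)
        top = np.kron(h, [1.0, 1.0])                    # averaging (scaling) rows
        bottom = np.kron(np.eye(n // 2), [1.0, -1.0])   # differencing (wavelet) rows
        m = np.vstack([top, bottom])
        return m / np.linalg.norm(m, axis=1, keepdims=True)

    # Hypothetical compression of one fully-connected weight matrix W.
    rng = np.random.default_rng(0)
    m_rows, n_cols, r = 64, 64, 16
    W = rng.standard_normal((m_rows, n_cols))

    H = haar_matrix(m_rows)   # deterministic: never needs to be stored in full
    idx = np.arange(r)        # stored instead of H: r row indices (assumed here
                              # to be the first r rows, purely for illustration)
    H_r = H[idx]              # r x m partial Haar-wavelet matrix
    G = H_r @ W               # small r x n factor kept with the model
    W_approx = H_r.T @ G      # rank-r reconstruction; rows of H are orthonormal

    err = np.linalg.norm(W - W_approx) / np.linalg.norm(W)
    print(f"relative reconstruction error: {err:.3f}")

Per the abstract, the actual method decomposes the weight tensor into a sequence of such partial Haar-wavelet matrices in tensor-train form; the single-matrix sketch above only makes concrete why storing row indices suffices when the basis is deterministic.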
Year
2018
DOI
10.1109/MLSP.2018.8516987
Venue
2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)
Keywords
high computation, memory cost, mobile devices, weight tensor, deterministic partial Haar-wavelet matrices, simple deep learning network, neural network, machine learning, tensor-train Haar-wavelet decomposition
Field
Tensor, Pattern recognition, Matrix (mathematics), Computer science, Algorithm, Mobile device, Artificial intelligence, Haar wavelet, Novelty, Deep learning, Artificial neural network, Computation
DocType
Conference
ISSN
1551-2541
ISBN
978-1-5386-5478-1
Citations
0
PageRank
0.34
References
0
Authors
4
Name                Order   Citations   PageRank
Wei-Zhi Huang       1       0           0.34
Sung-Hsien Hsieh    2       48          13.71
Chun-shien Lu       3       1238        104.71
Soo-Chang Pei       4       2054        241.11