Abstract |
---|
Deep convolutional neural networks (CNNs) are successfully used in a number of applications, but their storage and computational requirements have largely prevented their widespread deployment on mobile devices. Here we present a series of approaches for compressing and speeding up CNNs in the frequency domain, which consider not only small-magnitude weights but all weights and their underlying connections. By treating convolutional filters as images, we decompose their frequency-domain representations into common parts (i.e., cluster centers) shared among similar filters and individual private parts (i.e., individual residuals). A large number of low-energy frequency coefficients in both parts can be discarded to achieve high compression without significantly compromising accuracy. Furthermore, we explore a data-driven method for removing redundancies in both the spatial and frequency domains, which allows us to discard more useless weights while maintaining similar accuracy. After obtaining the optimal sparse CNN in the frequency domain, we reduce the computational burden of convolution operations by linearly combining the convolution responses of discrete cosine transform (DCT) bases. The compression and speed-up ratios of the proposed algorithm are thoroughly analyzed and evaluated on benchmark image datasets, demonstrating its superiority over state-of-the-art methods. |
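The decomposition the abstract describes can be illustrated with a minimal sketch: transform each filter with a 2-D DCT, cluster the coefficient vectors into shared centers ("common parts"), keep per-filter residuals ("private parts"), and zero out the low-energy residual coefficients. All specifics here (filter size, cluster count, the plain k-means loop, the `keep_ratio` threshold) are illustrative assumptions, not the paper's actual settings or algorithm.

```python
# Sketch of frequency-domain filter compression: DCT -> shared cluster
# centers + sparse private residuals. Hyperparameters are illustrative.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)

def compress_filters(filters, n_clusters=4, keep_ratio=0.3, n_iter=20):
    """Decompose n d-by-d filters into shared DCT centers + sparse residuals."""
    n, d, _ = filters.shape
    # 1) Move each filter to the frequency domain (2-D DCT, as for images).
    coeffs = np.stack([dctn(f, norm="ortho") for f in filters]).reshape(n, -1)
    # 2) Plain k-means over DCT coefficients -> shared "common parts".
    centers = coeffs[rng.choice(n, n_clusters, replace=False)]
    for _ in range(n_iter):
        dist = np.linalg.norm(coeffs[:, None] - centers[None], axis=2)
        labels = dist.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = coeffs[labels == k].mean(axis=0)
    # 3) "Private parts": residuals from the assigned center; discard the
    #    low-energy coefficients, keeping only the largest `keep_ratio`.
    residuals = coeffs - centers[labels]
    kth = int((1 - keep_ratio) * residuals.size)
    thresh = np.partition(np.abs(residuals).ravel(), kth)[kth]
    residuals[np.abs(residuals) < thresh] = 0.0
    return centers, labels, residuals, (d, d)

def reconstruct(centers, labels, residuals, shape):
    """Rebuild approximate spatial filters: center + residual, then inverse DCT."""
    coeffs = centers[labels] + residuals
    return np.stack([idctn(c.reshape(shape), norm="ortho") for c in coeffs])

filters = rng.standard_normal((32, 7, 7))       # toy stand-in for learned filters
centers, labels, residuals, shape = compress_filters(filters)
approx = reconstruct(centers, labels, residuals, shape)
sparsity = float(np.mean(residuals == 0))       # fraction of pruned coefficients
```

Only the cluster centers, assignment indices, and sparse residuals need to be stored, which is where the compression comes from; the paper additionally speeds up inference by computing convolutions against DCT bases and combining the responses linearly.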
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/TPAMI.2018.2857824 | IEEE Transactions on Pattern Analysis and Machine Intelligence |
Keywords | Field | DocType
---|---|---|
Convolution,Frequency-domain analysis,Discrete cosine transforms,Image coding,Redundancy,Convolutional neural networks,Mobile handsets | Frequency domain,Pattern recognition,Convolutional neural network,Convolution,Computer science,Discrete cosine transform,Image coding,Mobile device,Redundancy (engineering),Artificial intelligence,Discrete cosine transforms | Journal |
Volume | Issue | ISSN
---|---|---|
41 | 10 | 1939-3539 |
Citations | PageRank | References
---|---|---|
5 | 0.42 | 8 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yunhe Wang | 1 | 113 | 22.76 |
Chang Xu | 2 | 781 | 47.60 |
Chao Xu | 3 | 1327 | 62.65 |
Dacheng Tao | 4 | 19032 | 747.78 |