Title
CircConv: A Structured Convolution with Low Complexity
Abstract
Deep neural networks (DNNs), especially deep convolutional neural networks (CNNs), have emerged as powerful techniques in a wide range of machine learning applications. However, the large model sizes of DNNs place heavy demands on computational resources and weight storage, limiting their practical deployment. To overcome these limitations, this paper proposes to impose a circulant structure on the construction of convolutional layers, leading to circulant convolutional layers (CircConvs) and circulant CNNs. Circulant models can either be trained from scratch or re-trained from a pre-trained non-circulant model, making the approach flexible across different training environments. Extensive experiments show that this structure-imposing approach substantially reduces the number of parameters of convolutional layers and enables significant savings in computational cost through fast multiplication of the circulant tensor.
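The fast multiplication that the abstract refers to comes from a standard property of circulant matrices: they are diagonalized by the discrete Fourier transform, so a circulant matrix-vector product reduces from O(n^2) to O(n log n). The sketch below illustrates this for the 1-D case; it is a minimal illustration of the underlying identity, not the authors' implementation, and the function names are hypothetical.

```python
import numpy as np

def circulant_dense(c):
    """Build the dense n-by-n circulant matrix whose first column is c
    (entry [i, j] is c[(i - j) mod n]); the naive product costs O(n^2)."""
    n = len(c)
    return np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

def circulant_matvec(c, x):
    """Multiply circ(c) by x in O(n log n): circ(c) @ x is the circular
    convolution of c and x, which is elementwise in the Fourier domain."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
```

For real inputs, `np.fft.rfft`/`np.fft.irfft` would halve the transform size; the complex FFT is used here only to keep the identity explicit.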
Year: 2019
Venue: National Conference on Artificial Intelligence
Field: Scratch, Tensor, Convolutional neural network, Computer science, Convolution, Circulant matrix, Multiplication, Artificial intelligence, Computer engineering, Limiting, Machine learning, Computation
DocType:
Volume: abs/1902.11268
ISSN:
Journal: Published in AAAI 2019
Citations: 0
PageRank: 0.34
References: 20
Authors: 6
Name          Order  Citations  PageRank
Siyu Liao     1      41         8.73
Zhe Li        2      82         7.50
Liang Zhao    3      13         6.42
Qinru Qiu     4      1120       102.58
Yanzhi Wang   5      1082       136.11
Bo Yuan       6      262        28.64