Title
Dense Capsule Networks With Fewer Parameters
Abstract
The capsule network (CapsNet) is a promising model in computer vision. It has achieved excellent results on MNIST, but its performance on real-world images remains limited. Deepening capsule architectures is an effective way to improve performance, but the computational cost hinders their development. To curb parameter growth and build an efficient architecture, this paper proposes a tensor capsule layer based on multistage separable convolutions and a dense capsule architecture. Multistage separable convolutions effectively reduce the number of parameters at the cost of a small performance loss, and in the dense capsule architecture, dense connections allow the capsule network to be deeper and easier to train. Combining the two yields a novel lightweight dense capsule network. Experiments show that this network uses only 0.05% of the parameters of CapsNet while improving performance by 8.25% on CIFAR10. In addition, the full tensor capsule method is proposed to keep the number of capsule network parameters independent of the image scale; experiments show that it leaves the parameter count unchanged with only a minor effect on performance. To lighten the fully connected capsule layer, a dynamic routing algorithm based on separable matrices is proposed. Beyond its use in our models, this algorithm also compresses CapsNet by 41.25% while losing only 0.47% performance on CIFAR10. Finally, a parameter utilization index is proposed to quantify the relationship between parameters and performance. To our knowledge, this is the first paper to study lightweight capsule networks.
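The two parameter-reduction ideas named in the abstract can be made concrete. The sketch below is a hypothetical PyTorch illustration, not the authors' implementation: it compares the weight count of a standard convolution against a depthwise-separable factorization (the building block behind separable convolutions), and of a full capsule transformation matrix against a separable low-rank factorization of the kind used to lighten dynamic routing. All layer sizes (c_in, c_out, k, d_in, d_out, r) are assumed for the example.

```python
# Hypothetical sketch (PyTorch), not the paper's code: it only illustrates
# why separable factorizations shrink parameter counts.
import torch.nn as nn

def n_params(m: nn.Module) -> int:
    return sum(p.numel() for p in m.parameters())

c_in, c_out, k = 256, 256, 3          # assumed channel/kernel sizes

# Standard convolution: c_in * c_out * k * k = 589,824 weights.
standard_conv = nn.Conv2d(c_in, c_out, k, padding=1, bias=False)

# Depthwise-separable convolution: a per-channel k x k convolution plus a
# 1 x 1 pointwise convolution, c_in*k*k + c_in*c_out = 67,840 weights.
separable_conv = nn.Sequential(
    nn.Conv2d(c_in, c_in, k, padding=1, groups=c_in, bias=False),
    nn.Conv2d(c_in, c_out, 1, bias=False),
)

d_in, d_out, r = 8, 16, 2             # assumed capsule dimensions and rank

# Full transformation matrix for one capsule pair: d_out * d_in = 128 weights.
full_transform = nn.Linear(d_in, d_out, bias=False)

# Separable (low-rank) factorization: r * (d_in + d_out) = 48 weights.
separable_transform = nn.Sequential(
    nn.Linear(d_in, r, bias=False),
    nn.Linear(r, d_out, bias=False),
)

print(n_params(standard_conv), n_params(separable_conv))        # 589824 67840
print(n_params(full_transform), n_params(separable_transform))  # 128 48
```

Because a capsule layer replicates the transformation matrix across every pair of input and output capsules, the per-pair saving from the low-rank factorization multiplies across the whole layer.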
Year
2021
DOI
10.1007/s00500-021-05774-6
Venue
SOFT COMPUTING
Keywords
Capsule network, Neural network, Convolution network, Image classification
DocType
Journal
Volume
25
Issue
10
ISSN
1432-7643
Citations
0
PageRank
0.34
References
0
Authors
4
Name          Order  Citations  PageRank
Sun Kun       1      5595       2.07
Xian-Bin Wen  2      551        6.67
Liming Yuan   3      0          2.70
Haixia Xu     4      0          3.04