Title
Dimension-aware attention for efficient mobile networks
Abstract
• An attention mechanism (the DAA block) is developed for feature enhancement.
• A multi-branch factorization design enables low redundancy and high efficiency.
• The DAA block adds only a small computational cost while providing large receptive fields.
• The DAA block can be easily integrated into existing mobile networks.
• Experiments on multiple vision tasks show the effectiveness of our method.
Year
2022
DOI
10.1016/j.patcog.2022.108899
Venue
Pattern Recognition
Keywords
Efficient mobile networks, Attention mechanism, Feature enhancement, Multi-branch factorization, Multi-dimensional information
DocType
Journal
Volume
131
Issue
1
ISSN
0031-3203
Citations
0
PageRank
0.34
References
21
Authors
5
Name          Order  Citations  PageRank
Rongyun Mo    1      0          0.34
Shenqi Lai    2      0          1.01
Yan Yan       3      240        48.08
Zhenhua Chai  4      12         6.59
Xiaolin Wei   5      7          8.27
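The highlights above describe an attention block that factorizes attention across feature dimensions using several lightweight branches and is meant to drop into existing mobile backbones. As a rough illustration only, below is a minimal PyTorch sketch of what such a dimension-aware, multi-branch block could look like. The record does not specify the actual DAA architecture, so the branch layout, the kernel size, and the DAABlockSketch name are all assumptions, not the paper's design.

# A minimal PyTorch sketch of a dimension-aware attention block with
# multi-branch factorization. Illustrative only: the branch structure,
# kernel size, and class name are assumptions, NOT the paper's DAA design.
import torch
import torch.nn as nn


class DAABlockSketch(nn.Module):
    """Hypothetical dimension-aware attention: one cheap branch per dimension.

    Each branch pools the input globally over two dimensions, applies a 1D
    convolution along the remaining dimension, and produces a sigmoid gate
    that rescales the features. Global pooling plus a cheap 1D conv is one
    way to obtain large effective receptive fields at low cost.
    """

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        pad = kernel_size // 2
        # One single-channel 1D conv per dimension-wise branch (C, H, W).
        self.channel_gate = nn.Conv1d(1, 1, kernel_size, padding=pad)
        self.height_gate = nn.Conv1d(1, 1, kernel_size, padding=pad)
        self.width_gate = nn.Conv1d(1, 1, kernel_size, padding=pad)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Channel branch: pool over (H, W) -> (N, 1, C) -> per-channel gate.
        a_c = torch.sigmoid(self.channel_gate(x.mean(dim=(2, 3)).unsqueeze(1)))
        a_c = a_c.view(n, c, 1, 1)
        # Height branch: pool over (C, W) -> (N, 1, H) -> per-row gate.
        a_h = torch.sigmoid(self.height_gate(x.mean(dim=(1, 3)).unsqueeze(1)))
        a_h = a_h.view(n, 1, h, 1)
        # Width branch: pool over (C, H) -> (N, 1, W) -> per-column gate.
        a_w = torch.sigmoid(self.width_gate(x.mean(dim=(1, 2)).unsqueeze(1)))
        a_w = a_w.view(n, 1, 1, w)
        # Multiplicative fusion of the three dimension-wise gates.
        return x * a_c * a_h * a_w

Under these assumptions the block is drop-in, e.g. y = DAABlockSketch()(torch.randn(2, 64, 56, 56)) for any NCHW tensor, which matches the highlight that such a block integrates easily with existing mobile networks.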