**Abstract (Highlights)**

- An attention mechanism (the DAA block) is developed for feature enhancement.
- Multi-branch factorization design enables low redundancy and high efficiency.
- The DAA block introduces a small computational cost with large receptive fields.
- The DAA block can be easily integrated with existing mobile networks.
- Experiments on multiple vision tasks show the effectiveness of our method.
| Field | Value |
|---|---|
| Year | 2022 |
| DOI | 10.1016/j.patcog.2022.108899 |
| Venue | Pattern Recognition |
| Keywords | Efficient mobile networks, Attention mechanism, Feature enhancement, Multi-branch factorization, Multi-dimensional information |
| DocType | Journal |
| Volume | 131 |
| Issue | 1 |
| ISSN | 0031-3203 |
| Citations | 0 |
| PageRank | 0.34 |
| References | 21 |
| Authors | 5 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Rongyun Mo | 1 | 0 | 0.34 |
| Shenqi Lai | 2 | 0 | 1.01 |
| Yan Yan | 3 | 240 | 48.08 |
| Zhenhua Chai | 4 | 12 | 6.59 |
| Xiaolin Wei | 5 | 7 | 8.27 |