Title: Aggregation Cross-Entropy for Sequence Recognition
Abstract: In this paper, we propose a novel method, aggregation cross-entropy (ACE), for sequence recognition from a brand new perspective. The ACE loss function exhibits performance competitive with CTC and the attention mechanism, with a much simpler implementation (it involves only four fundamental formulas), faster inference/back-propagation (approximately O(1) in parallel), a smaller storage requirement (no parameters and negligible runtime memory), and convenient deployment (by simply replacing CTC with ACE). Furthermore, the proposed ACE loss function exhibits two noteworthy properties: (1) it can be applied directly to 2D prediction by flattening the 2D prediction into a 1D prediction as input, and (2) it requires only the characters and their numbers in the sequence annotation for supervision, which allows it to advance beyond sequence recognition, e.g., to the counting problem.
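The abstract's supervision scheme (only characters and their counts, no alignment) can be sketched in a few lines: aggregate the per-timestep class probabilities into one distribution and take its cross-entropy against the normalized character counts. The following NumPy sketch is illustrative only, not the authors' reference implementation; the blank-at-index-0 convention, the epsilon inside the log, and the function name `ace_loss` are assumptions for this example.

```python
import numpy as np

def ace_loss(probs, char_counts):
    """Aggregation cross-entropy loss (illustrative sketch).

    probs:       (T, C) per-timestep class probabilities; class 0 is
                 assumed to be the blank class.
    char_counts: (C - 1,) occurrence count of each non-blank character
                 in the sequence annotation.
    """
    T = probs.shape[0]
    # Aggregate: mean probability of each class over all T timesteps.
    y_bar = probs.mean(axis=0)
    # Target distribution: character counts, with blank filling the
    # remaining timesteps, normalized by the sequence length T.
    counts = np.concatenate(([T - char_counts.sum()], char_counts))
    n_bar = counts / T
    # Cross-entropy between target and aggregated prediction;
    # zero-count classes are skipped, epsilon guards log(0).
    mask = n_bar > 0
    return -(n_bar[mask] * np.log(y_bar[mask] + 1e-10)).sum()
```

Note that no per-timestep alignment appears anywhere: the loss reaches its minimum (the entropy of the count distribution) whenever the aggregated prediction matches the character counts, regardless of where in the sequence each character is predicted.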
Year: 2019
DOI: 10.1109/CVPR.2019.00670
Venue: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
Field: Cross entropy, Flattening, Annotation, Pattern recognition, Computer science, Inference, Counting problem, Artificial intelligence
DocType: Journal
Volume: abs/1904.08364
ISSN: 1063-6919
Citations: 7
PageRank: 0.42
References: 0
Authors: 6
Name          | Order | Citations | PageRank
Zecheng Xie   | 1     | 96        | 7.55
Yaoxiong Huang| 2     | 9         | 0.79
Yuanzhi Zhu   | 3     | 16        | 2.91
Lianwen Jin   | 4     | 1337      | 113.14
Yuliang Liu   | 5     | 66        | 13.22
Lele Xie      | 6     | 21        | 2.34