Title
Transformer Lesion Tracker
Abstract
Evaluating lesion progression and treatment response via longitudinal lesion tracking plays a critical role in clinical practice. Automated approaches for this task are motivated by the prohibitive labor cost and time consumption of manual lesion matching. Previous methods typically lack the integration of local and global information. In this work, we propose a transformer-based approach, termed Transformer Lesion Tracker (TLT). Specifically, we design a Cross Attention-based Transformer (CAT) to capture and combine both global and local information to enhance feature extraction. We also develop a Registration-based Anatomical Attention Module (RAAM) to introduce anatomical information into CAT so that it can focus on useful feature knowledge. A Sparse Selection Strategy (SSS) is presented for selecting features and reducing the memory footprint during Transformer training. In addition, we use global regression to further improve model performance. We conduct experiments on a public dataset to show the superiority of our method and find that our model improves the average Euclidean center error by at least 14.3% (6 mm vs. 7 mm) compared with the state of the art (SOTA). Code is available at https://github.com/TangWen920812/TLT.
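The abstract's central mechanism is cross attention that fuses local lesion features with global context. Below is a minimal sketch, assuming PyTorch, of what such a fusion block can look like: local tokens act as queries over global tokens. All class and variable names here are illustrative assumptions, not the authors' released CAT implementation (see their repository for the actual code).

import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    """Illustrative cross-attention fusion: each local feature token
    attends over global context tokens (a sketch, not the paper's CAT)."""
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_q = nn.LayerNorm(dim)
        self.norm_kv = nn.LayerNorm(dim)
        self.norm_ffn = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, local_tokens, global_tokens):
        # Queries come from local features; keys/values from global features,
        # so global context is injected into each local token.
        q = self.norm_q(local_tokens)
        kv = self.norm_kv(global_tokens)
        fused, _ = self.attn(q, kv, kv)
        x = local_tokens + fused                 # residual fusion of global context
        return x + self.ffn(self.norm_ffn(x))    # position-wise refinement

# Hypothetical sizes: 64 local tokens attend over 256 global tokens.
local_feats = torch.randn(2, 64, 128)
global_feats = torch.randn(2, 256, 128)
out = CrossAttentionBlock(dim=128)(local_feats, global_feats)
print(out.shape)  # torch.Size([2, 64, 128])

The design choice illustrated here is the asymmetry of the query/key-value split: keeping local tokens as queries preserves their spatial resolution while letting every one of them read from the whole-volume context.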
Year
2022
DOI
10.1007/978-3-031-16446-0_19
Venue
MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2022, PT VI
Keywords
Transformer, Cross attention, Registration
DocType
Conference
Volume
13436
ISSN
0302-9743
Citations
0
PageRank
0.34
References
0
Authors
6
Name             Order  Citations  PageRank
Wen Tang         1      0          1.01
Han Kang         2      0          1.01
Haoyue Zhang     3      0          0.68
Pengxin Yu       4      0          1.01
Corey W. Arnold  5      0          0.68
Rongguo Zhang    6      0          1.01