Title
Sparse Codebook Model of Local Structures for Retrieval of Focal Liver Lesions Using Multiphase Medical Images.
Abstract
Characterization and individual trait analysis of focal liver lesions (FLLs) is a challenging task in medical image processing and clinical practice. Character analysis of an unconfirmed FLL case can benefit greatly from accumulated FLL cases with experts' analyses, which can be achieved by content-based medical image retrieval (CBMIR). CBMIR mainly comprises discriminative feature extraction and similarity calculation procedures. The Bag-of-Visual-Words (BoVW) codebook-based model has been proven effective for various classification and retrieval tasks. This study investigates an improved codebook model for fine-grained medical image representation with the following three advantages: (1) instead of SIFT, we exploit the local patch (structure) as the local descriptor, which retains all detailed information and is more suitable for fine-grained medical image applications; (2) to more accurately approximate any local descriptor in the coding procedure, a sparse coding method, instead of the K-means algorithm, is employed for codebook learning and coded vector calculation; (3) we evaluate retrieval performance of focal liver lesions using multiphase computed tomography (CT) scans, in which the proposed codebook model is learned separately for each phase. The effectiveness of the proposed method is confirmed by our experiments on FLL retrieval.
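The pipeline described in the abstract (raw local patches as descriptors, sparse coding over a codebook instead of K-means quantization, one pooled feature per image) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the patch size, codebook size, matching-pursuit coder, max-pooling step, and the random stand-in codebook and image are all assumptions made for the example.

```python
# Hedged sketch of a sparse-codebook image feature, assuming:
# - 5x5 raw patches as local descriptors (stand-in for the paper's patches)
# - a random unit-norm dictionary in place of a learned codebook
# - greedy matching pursuit as the sparse coder, max-pooling of codes
import numpy as np

def extract_patches(image, size=5, stride=5):
    """Collect flattened size x size raw patches (the 'local structures')."""
    H, W = image.shape
    patches = []
    for i in range(0, H - size + 1, stride):
        for j in range(0, W - size + 1, stride):
            p = image[i:i + size, j:j + size].ravel().astype(float)
            p -= p.mean()  # remove the local DC component
            patches.append(p)
    return np.array(patches)

def matching_pursuit(x, D, n_nonzero=3):
    """Greedy sparse coding of x over dictionary D (columns = atoms)."""
    code = np.zeros(D.shape[1])
    residual = x.copy()
    for _ in range(n_nonzero):
        corr = D.T @ residual          # correlation with every atom
        k = np.argmax(np.abs(corr))    # best-matching atom
        code[k] += corr[k]
        residual -= corr[k] * D[:, k]
    return code

def image_feature(image, D, n_nonzero=3):
    """Max-pool absolute sparse codes of all patches into one vector."""
    size = int(np.sqrt(D.shape[0]))
    P = extract_patches(image, size=size)
    codes = np.array([matching_pursuit(p, D, n_nonzero) for p in P])
    return np.abs(codes).max(axis=0)

rng = np.random.default_rng(0)
# Stand-in codebook: 64 unit-norm atoms for 5x5 (25-dim) patches.
D = rng.standard_normal((25, 64))
D /= np.linalg.norm(D, axis=0)
image = rng.standard_normal((40, 40))  # stand-in for one CT phase image
f = image_feature(image, D)
print(f.shape)  # (64,)
```

Under the multiphase setting described in the abstract, one such codebook would be learned per CT phase, and retrieval would compare the per-phase pooled features (e.g., by cosine similarity) between a query lesion and the database.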
Year
2017
DOI
10.1155/2017/1413297
Venue
Int. J. Biomedical Imaging
Field
Computer vision, Scale-invariant feature transform, Neural coding, Computer science, Image representation, Image retrieval, Image processing, Coding (social sciences), Feature extraction, Artificial intelligence, Codebook
DocType
Journal
Volume
2017
ISSN
1687-4188
Citations
2
PageRank
0.41
References
11
Authors
7
Name          Order  Citations  PageRank
Jian Wang     1      28         4.20
Xian-Hua Han  2      14         10.19
Yingying Xu   3      8          3.62
Lanfen Lin    4      78         24.70
Hongjie Hu    5      41         10.45
Chongwu Jin   6      4          0.77
Yen-Wei Chen  7      720        155.73