Title
A shallow extraction of texture features for classification of abnormal video endoscopy frames
Abstract
Automated analysis of gastric lesions in endoscopy videos is a challenging task, and the dynamics of the gastrointestinal environment make it even more difficult. In computer-aided diagnosis, gastric images are analyzed by visual descriptors. Various Deep Convolutional Neural Network (DCNN) models are available for representation learning and classification. In this paper, a computer-aided diagnosis system is presented for the classification of abnormalities in Video Endoscopy (VE) images based on Deep Gray-Level Co-occurrence Matrix (DeepGLCM) texture features. In our scheme, the convolutional layers of an already trained model are employed to acquire statistical features from the filter responses, estimating a texture representation of VE frames. A learning model is then trained on these features for gastric frame classification. The performance of the proposed method is evaluated on public datasets of endoscopy images. In addition, we also use a private endoscopy dataset acquired from the University of Aveiro. DeepGLCM outperforms existing methods, achieving an average accuracy of approximately 92% and an area under the curve (AUC) of 0.96 on the chromoendoscopy (CH) dataset, and approximately 85% accuracy on the Confocal Laser Endomicroscopy (CLE) and white-light video endoscopy datasets. It is evident that the DeepGLCM texture features provide a better representation than traditional texture extraction methods by efficiently dealing with variance in images due to different imaging technologies.
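The core idea described in the abstract, computing gray-level co-occurrence statistics over the responses of pretrained convolutional filters, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the CNN stage is stood in for by a random 2-D feature map, the gray-level count and offset are arbitrary choices, and the function names are ours.

```python
import numpy as np

def glcm(img, levels=8, offset=(0, 1)):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    dr, dc = offset
    m = np.zeros((levels, levels), dtype=np.float64)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r, c], img[r2, c2]] += 1
    return m / m.sum()

def glcm_features(p):
    """Haralick-style statistics commonly used as a texture descriptor."""
    i, j = np.indices(p.shape)
    contrast = ((i - j) ** 2 * p).sum()
    homogeneity = (p / (1.0 + np.abs(i - j))).sum()
    energy = (p ** 2).sum()
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
    return np.array([contrast, homogeneity, energy, entropy])

# Toy stand-in for one CNN feature map; in the described pipeline this would
# be the response of a convolutional layer of a pretrained model to a frame.
rng = np.random.default_rng(0)
fmap = rng.random((32, 32))

# Quantize the response to a small number of gray levels, then compute the
# co-occurrence statistics that form one slice of the texture descriptor.
q = np.minimum((fmap * 8).astype(int), 7)
desc = glcm_features(glcm(q, levels=8))
```

Concatenating such descriptors across feature maps (and offsets) would yield the feature vector on which a conventional classifier is trained.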
Year: 2022
DOI: 10.1016/j.bspc.2022.103733
Venue: BIOMEDICAL SIGNAL PROCESSING AND CONTROL
Keywords: Classification, Endoscopy, Texture analysis, Gastric cancer, Deep learning
DocType: Journal
Volume: 77
ISSN: 1746-8094
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name | Order | Citations | PageRank
Hussam Ali | 1 | 0 | 0.34
Muhammad Sharif | 2 | 3173 | 7.96
Mussarat Yasmin | 3 | 0 | 0.34
Mubashir Husain Rehmani | 4 | 8195 | 4.69