Title
Convolutional Neural Network for Automated Mass Segmentation in Mammography
Abstract
Automatic segmentation and localization of lesions in mammogram (MG) images are challenging problems, even for advanced methods such as deep learning (DL) [1]-[3]. To address these challenges, we propose a U-Net approach to automatically detect and segment lesions in MG images. U-Net [4] is an end-to-end convolutional neural network (CNN) based model that has achieved remarkable results in segmenting biomedical images [5]. We modified the architecture of the U-Net model to maximize its precision, for example by using batch normalization, adding dropout, and applying data augmentation. Owing to its architecture, the proposed U-Net model efficiently predicts a pixel-wise segmentation map of an input full MG image. These pixel-wise segmentation maps help radiologists differentiate benign from malignant lesions depending on lesion shape. The main challenge that most DL methods face in mammography is the need for large annotated training data-sets: to train such DL networks without over-fitting, thousands or millions of training MG images are needed [1], [3], [5]. In contrast, U-Net is capable of learning from a relatively small training data-set compared to other DL methods [4]. We used publicly available databases (CBIS-DDSM, BCDR-01, and INbreast) and MG images from the University of Connecticut Health Center (UCHC) to train the proposed U-Net model [3]. The proposed U-Net method is trained on MG images that have mass lesions of different sizes, shapes, margins, and intensity variations around mass boundaries. All the training MG images containing suspicious areas are accompanied by associated pixel-level ground truth maps (GTMs), which indicate the background and breast-lesion labels for each pixel. A total of 2066 MG images and their corresponding segmentation GTMs are used to train the proposed U-Net model.
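The abstract does not list the specific augmentation transforms used; the sketch below illustrates the general idea with flips and 90-degree rotations, a common label-preserving choice for segmentation. The key point is that the MG image and its GTM must receive exactly the same geometric transform so pixel labels stay aligned.

```python
import numpy as np

def augment_pair(image, mask, rng):
    """Apply one random flip/rotation to an MG image and its GTM together.

    Illustrative sketch only: the paper's exact augmentation set is not
    stated in the abstract; flips and quarter-turn rotations are assumed.
    """
    if rng.random() < 0.5:  # horizontal flip
        image, mask = np.flip(image, axis=1), np.flip(mask, axis=1)
    if rng.random() < 0.5:  # vertical flip
        image, mask = np.flip(image, axis=0), np.flip(mask, axis=0)
    k = int(rng.integers(0, 4))  # 0-3 quarter turns
    return np.rot90(image, k), np.rot90(mask, k)

rng = np.random.default_rng(0)
img = np.arange(16, dtype=float).reshape(4, 4)
gtm = (img > 7).astype(np.uint8)  # toy lesion mask
aug_img, aug_gtm = augment_pair(img, gtm, rng)
```

Because the same transform is applied to both arrays, the augmented mask still labels the same pixels of the augmented image, and the lesion area (mask sum) is preserved.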
Moreover, we applied the adaptive median filter (AMF) and the contrast limited adaptive histogram equalization (CLAHE) filter to the training MG images to enhance their characteristics and improve the performance of the downstream analysis [3]. We compared the efficiency of our model with those of the state-of-the-art Faster R-CNN model [6] and the region growing (RG) model [7]. We tested our proposed U-Net method using film-based and fully digitized MG images. The proposed U-Net model shows slightly better performance in detecting true segments than the Faster R-CNN model, but outperforms it significantly in terms of runtime. In addition, the proposed U-Net model gives precise segments of the lesions in the MG images, whereas the Faster R-CNN method gives bounding boxes surrounding the lesions. Moreover, the proposed U-Net method performs better than the RG model. Data augmentation has been very effective in our experiments, increasing the Dice similarity coefficient between the GTMs and the segmented lesion maps from 0.918 to 0.983. Also, the proposed model yielded an Intersection over Union (IoU) of 0.974, compared to an IoU of 0.966 for the state-of-the-art Faster R-CNN model. In conclusion, the performance of the proposed DL model shows promise for practical clinical application to assist radiologists.
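The Dice similarity coefficient and IoU reported above are standard overlap metrics between a predicted binary mask and the GTM. A minimal NumPy sketch of both (with toy masks, not the paper's data):

```python
import numpy as np

def dice(pred, gt):
    """Dice similarity coefficient: 2|A∩B| / (|A|+|B|) for binary masks."""
    inter = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    return 2.0 * inter / total if total else 1.0

def iou(pred, gt):
    """Intersection over Union (Jaccard index): |A∩B| / |A∪B|."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union else 1.0

# Toy example: 4x4 ground-truth lesion, prediction misses one row.
gt = np.zeros((8, 8), dtype=bool); gt[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool); pred[3:6, 2:6] = True
print(round(dice(pred, gt), 3))  # 2*12/(12+16) ≈ 0.857
print(round(iou(pred, gt), 3))   # 12/16 = 0.75
```

Dice weighs the intersection twice, so for the same masks it is always at least as large as IoU, which is consistent with the paper reporting Dice 0.983 alongside IoU 0.974.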
Year: 2018
DOI: 10.1109/ICCABS.2018.8542071
Venue: 2018 IEEE 8th International Conference on Computational Advances in Bio and Medical Sciences (ICCABS)
Keywords: convolutional neural networks (CNNs), deep learning (DL), U-Net, localization, detection, segmentation, pre-processing, mammograms (MGs), ground truth maps (GTMs), data imbalance
Field: Median filter, Pattern recognition, Biology, Segmentation, Convolutional neural network, Image segmentation, Adaptive histogram equalization, Pixel, Region growing, Artificial intelligence, Bioinformatics, Deep learning
DocType: Conference
ISSN: 2164-229X
ISBN: 978-1-5386-8521-1
Citations: 1
PageRank: 0.35
References: 3
Authors: 5
Name / Order / Citations / PageRank
Dina Abdelhafiz151.46
Sheida Nabavi2188.68
Reda A. Ammar314541.09
Clifford Yang461.82
Jinbo Bi51432104.24