Article

A New Method for Detecting Architectural Distortion in Mammograms by NonSubsampled Contourlet Transform and Improved PCNN

School of Information Engineering, Zhengzhou University, Zhengzhou 450001, China
*
Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(22), 4916; https://doi.org/10.3390/app9224916
Submission received: 5 September 2019 / Revised: 19 October 2019 / Accepted: 11 November 2019 / Published: 15 November 2019

Featured Application

The proposed scheme can efficiently help doctors to perform computer-aided detection of the architectural distortion in mammograms.

Abstract

Breast cancer is the leading cause of cancer death in women, and early detection can reduce mortality. Architectural distortion (AD) is a clinical manifestation of breast cancer; however, its complex structure and the low accuracy of current detection methods contribute to the high mortality of breast cancer. In order to improve the accuracy of AD detection and reduce the mortality of breast cancer, this paper proposes a new method that combines the non-subsampled contourlet transform (NSCT) with an improved pulse coupled neural network (PCNN). Firstly, the top–bottom hat transformation and the exponential transformation are employed to enhance the image. Secondly, the NSCT is employed to expand the overall contrast of the mammograms and filter out the noise. Finally, the PCNN, improved by the maximum inter-class variance (Otsu) threshold selection method, is employed to complete the AD detection. The proposed approach is tested on the public and authoritative Digital Database for Screening Mammography (DDSM). The specificity of the method is 98.73%, the accuracy is 93.16%, the F1-score is 79.80%, and the area under curve (AUC) of the receiver operating characteristic (ROC) curve is 0.93; these results clearly demonstrate that the proposed method is comparable with those in the recent literature. The proposed method is simple, achieves high accuracy, and can help doctors to perform computer-aided detection of AD effectively.

1. Introduction

In recent years, the incidence and mortality of breast cancer have increased to a great extent, and it has become a major public health problem [1]. According to a report released by the International Agency for Research on Cancer (IARC) in 2012, more than 1.67 million women are affected by breast cancer every year worldwide, and this number is growing [2]. If breast cancer is detected early, the 5-year relative survival rate is 99%; if the cancer spreads to a distant part of the body, the 5-year survival rate is 27% [3]. Hence, early diagnosis of breast cancer is particularly important. At present, digitized screen-film mammography is the most effective tool for early detection of breast cancer. However, due to the complex structure of the breast and the quality of the mammograms, doctors are prone to misdiagnosis at the time of diagnosis. Therefore, computer-aided diagnosis (CAD) systems have come into being. A suitable CAD system can provide effective help for radiologists during diagnosis, reduce the workload of doctors, and improve the accuracy of diagnosis [4].
The characteristics of breast cancer mainly include three types: mass, calcification, and architectural distortion (AD). AD is a distortion of the mammary gland that occurs when the lesion destroys the normal structure of the gland but no mass is present [5]. Due to the complexity of the causes of AD, the limitations of image quality, and the variability of physician experience during the examination, radiologists often misjudge AD [5]. Thus, the detection and analysis of AD has become a research hot-spot in the field of CAD for breast cancer.
Recently, many computer-aided AD detection algorithms have been proposed. Guo et al. proposed a fractal analysis method to distinguish normal breast tissue from AD in the region of interest (ROI); however, this method required the ROI to be set in advance [6]. Ayres et al. proposed using Gabor filters and phase portraits to detect AD by analyzing the texture direction of the mammograms, with a sensitivity of 84% [7]. Ayres et al. also proposed a Geometrically Constrained Phase Portrait Model (GCPPM), which greatly reduced the false positive rate [8]. Banik et al. proposed a method for detecting AD using the oriented texture obtained by a Gabor filter bank, with a sensitivity of 88% [9]. Biswas et al. proposed a Gaussian Mixture Model (GMM) to detect AD, with a sensitivity of 84% [10]. Anand et al. proposed using the contourlet transform and a neural network to detect AD, with an accuracy of 64% [11]. Rangayyan et al. proposed employing the angular deviation of the linear structures of breast tissue as a feature to detect AD [12]. Zhang et al. proposed calculating the similarity convergence index of breast spiculations for detecting AD, with a sensitivity of 88.9% [13]. Lakshmanan et al. proposed using a high-pass isotropic filter to detect suspicious regions of AD, and identified AD by detecting the contours of the suspicious regions [14]. Yoshikawa et al. proposed using an adaptive Gabor filter to detect AD, with the filter parameters adjusted according to the mammary structure [15]. Matsubara et al. proposed extracting the line structures from the mammograms by the normal curvature for detecting AD [16]. Lakshmanan et al. proposed using geometrical properties of edge structures for detecting AD [17]. Narváez et al. extracted the linear structure information of the ROI and the edge, formed a new feature vector according to different weights, and detected AD by SVM, with an accuracy of 89% [18]. Akhtar et al. proposed using radial ridges to detect AD based on linear characteristics, with a sensitivity of 85% and a specificity of 80% [19]. Costa et al. proposed using deep learning to detect architectural distortion in mammograms; however, due to the limited dataset size, the accuracy was only 86.1% [20]. These techniques have made great contributions. However, due to the uneven background and noise in the mammograms, the rate of misdiagnosing AD is as high as 45% [21]. A high rate of misdiagnosing AD increases the risk of cancer spread and reduces the survival rate of patients [22]. At present, most studies focus on detecting breast masses and calcification points, and the highest reported accuracy for detecting AD is only 89% [18], so it is of great significance to further improve the accuracy of detecting AD.
In this paper, in order to improve the accuracy of detecting AD, a new detection method is proposed. During pre-processing, the top–bottom hat transformation and the exponential transformation are used to enhance the mammograms, which reduces the effect of uneven illumination and improves the image contrast. To obtain the suspicious AD area, the non-subsampled contourlet transform (NSCT) is employed to decompose the mammograms. For the high-frequency coefficients, those whose amplitude is larger than the threshold are enhanced, while those whose amplitude is smaller than the threshold are set directly to zero; for the low-frequency coefficients, the low-pass sub-band coefficients are stretched to expand the overall contrast of the mammograms. Through this processing, the edge detail information of AD is enhanced and the noise is filtered out. To detect the AD area, the improved pulse coupled neural network (PCNN) completes the AD detection; the optimal threshold calculated by Otsu is set as the initial threshold of the PCNN model, which reduces the number of parameters and the impact of manually set parameters, and this is the advantage of our work. The experimental results show that the proposed method is effective and has higher accuracy; moreover, the proposed method could be incorporated into CAD systems to assist physicians in detecting AD in the future.
The remainder of this paper is organized as follows: In Section 2, the structure of the proposed method and the relevant theoretical background are presented; in Section 3, the database used in this work is introduced and the evaluation methods are described; in Section 4, the experimental results are presented and discussed; finally, in Section 5 the conclusion is drawn.

2. Proposed Method

In image processing, the wavelet transform and the contourlet transform are generally used to acquire the detailed information of the mammograms. For detecting AD, it is necessary to represent the boundary and contour information in the mammograms well, but the traditional wavelet transform has a limited number of directions and lacks translation invariance; although the contourlet transform is effective in representing the anisotropy of the image, it may cause aliasing when performing directional filtering and does not have translation invariance either. In order to compensate for these limitations, the NSCT [23] was proposed. The NSCT has the translation invariance that the contourlet transform and wavelet transform lack, and it has multi-scale and multi-directional characteristics, which avoids the generation of pseudo-Gibbs effects.
The PCNN simulates the synchronous oscillation phenomenon in the visual cortex of animals such as cats and monkeys [24]. The PCNN is a third-generation artificial neural network model with a deep biological basis. Compared with traditional neural network models, the PCNN can generate synchronous oscillations, which is closer to the human visual system. It can segment images and identify targets without training, and it is widely used in the field of image processing [25].
In recent years, NSCT and PCNN have penetrated the field of image processing, but they have not yet been applied to the detection of AD. NSCT can better capture the detailed information of the mammograms to obtain the high-frequency information of AD. PCNN can complete AD detection accurately without a large number of training images, which suits the limited amount of available AD data. Therefore, this work proposes a method that combines NSCT and Otsu-PCNN to detect AD. The proposed method is shown in Figure 1.
This paper selects the “B_3412_1.LEFT_CC” mammogram in the DDSM database for a comprehensive description of the proposed detection method.

2.1. Pre-Processing

2.1.1. Method of Selection

Due to the unevenness of breast density, the contrast of the mammograms is low and the overall image quality is poor, which greatly increases the difficulty of diagnosis. In order to improve the contrast of the mammograms and facilitate the detection of AD, this paper proposes a method that combines the top–bottom hat transformation with the exponential transformation. This method not only enhances the AD information, but also suppresses the information of the surrounding tissue and background. The pre-processing is divided into the following two parts.

The Top–Bottom Hat Transformation

The top–bottom hat transformation utilizes morphological transformations of the image, which retain the features of the image that match the structural element and remove the features that do not [26]. The implementation of the top–bottom hat transformation is as follows:
f_T(x,y) = f(x,y) - [(f(x,y) \ominus S) \oplus S]   (1)
f_B(x,y) = [(f(x,y) \oplus S) \ominus S] - f(x,y)   (2)
Here, f(x,y) is the input image, S is the selected structural element, \ominus and \oplus denote erosion and dilation, f_T(x,y) is the result of the top hat transformation, and f_B(x,y) is the result of the bottom hat transformation.
The opening operation removes the bright details of the input image that are smaller than the structural element. Selecting an appropriate structural element, the top hat transformation highlights the AD information; the bottom hat transformation then obtains the trough information filled by the closing operation and highlights the dark areas in the image. Therefore, after the top–bottom hat transformation, the bright areas are made brighter and the dark areas are made darker, which enhances the contrast of the image. The result is shown in Figure 2.
In Figure 2, the structural element is a "line", and the figure shows that its length affects the image contrast. As the length increases, the gray values of some areas in the image become unusually large, which can cause some lesion areas to be lost when detecting AD. However, the angle has almost no effect on the image contrast. In this paper, the best enhancement is obtained when the structural element is a "line" with a length of 100 and an angle of 100 degrees.
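As a minimal sketch of this step, the snippet below builds a line-shaped structuring element (length 100, angle 100 degrees, mimicking MATLAB's strel('line', ...)) and applies the top-hat and bottom-hat transformations of Equations (1) and (2). The combination img + f_T - f_B is an assumption on our part; the paper only states that bright areas become brighter and dark areas darker.

```python
import numpy as np
from scipy.ndimage import white_tophat, black_tophat
from skimage.draw import line


def line_strel(length, angle_deg):
    """Flat line-shaped footprint, similar in spirit to MATLAB's strel('line', length, angle)."""
    theta = np.deg2rad(angle_deg)
    r_off = int(round((length - 1) / 2 * np.sin(theta)))
    c_off = int(round((length - 1) / 2 * np.cos(theta)))
    se = np.zeros((2 * abs(r_off) + 1, 2 * abs(c_off) + 1), dtype=bool)
    rr, cc = line(abs(r_off) - r_off, abs(c_off) - c_off,
                  abs(r_off) + r_off, abs(c_off) + c_off)
    se[rr, cc] = True
    return se


def top_bottom_hat_enhance(img, length=100, angle_deg=100):
    """Eqs. (1)-(2): top-hat and bottom-hat filtering with a line structuring element.

    Combining the results as img + f_T - f_B (brighten bright details, darken
    dark ones) is an assumed reconstruction of the enhancement described in the text.
    """
    se = line_strel(length, angle_deg)
    img = img.astype(np.float64)
    f_t = white_tophat(img, footprint=se)   # Eq. (1)
    f_b = black_tophat(img, footprint=se)   # Eq. (2)
    return np.clip(img + f_t - f_b, 0, 255)
```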

The Exponential Transformation

The contrast of the mammograms after the top–bottom hat transformation is somewhat enhanced, but in order to detect AD more accurately, the contrast needs to be enhanced further. Therefore, this paper chooses the exponential transformation, also known as the Gamma transformation. The Gamma transformation is a nonlinear transform. When γ > 1, the dynamic range of the low gray value area of the image becomes smaller and the dynamic range of the high gray value area becomes larger; therefore, the image contrast of the high gray value area is increased and the image contrast of the low gray value area is reduced. When γ < 1, the opposite is true [27]. In this paper, γ = 2.5 is set based on experience, and the enhancement effect on the mammograms is shown in Figure 3.
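A short sketch of the Gamma step with γ = 2.5 follows. Normalizing the image to [0, 1] before applying the power law is our assumption; the paper does not state the exact scaling used.

```python
import numpy as np


def gamma_transform(img, gamma=2.5):
    """Power-law (Gamma) transformation applied to an image normalised to [0, 1].

    With gamma > 1, low gray values are compressed and high gray values are
    expanded, as described in the text. The [0, 1] normalisation is an assumption.
    """
    img = img.astype(np.float64)
    norm = (img - img.min()) / (img.max() - img.min() + 1e-12)
    return norm ** gamma
```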
In Figure 3, the image contrast has been obviously enhanced through the top–bottom hat transformation and the Gamma transformation. This result is very helpful for the subsequent detection of AD.

2.1.2. Objective Analysis

The above subjective observations are affected by environmental and psychological factors. In order to explain the reliability of the image enhancement method proposed in this paper more clearly, Equivalent Number of Looks (ENL) and Contrast Improvement Index (CII) [28] are used to assess enhanced images.
The ENL and the CII represent the enhancement effect of the image contrast; they are defined as follows:
ENL = \frac{\mu}{\delta}   (3)
CII = \frac{C_1}{C_0}   (4)
Here, μ is the statistical mean of the input image, δ is the standard deviation of the input image, C_1 is the contrast of the enhanced image, and C_0 is the contrast of the original image. The smaller the ENL, the greater the image contrast and the clearer the overall image. When CII > 1, the image contrast is enhanced; the larger the CII, the better the enhancement. The comparison results are shown in Table 1.
Table 1 shows that the pre-processing method combining the top–bottom hat transformation with the Gamma transformation improves the image contrast greatly.
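A small sketch of the two indicators is given below. The paper does not spell out how the contrast C of an image is measured for the CII, so the global standard deviation is used here as a simple stand-in.

```python
import numpy as np


def enl(img):
    """Equivalent Number of Looks, Eq. (3): mean divided by standard deviation."""
    img = np.asarray(img, dtype=np.float64)
    return img.mean() / img.std()


def cii(enhanced, original):
    """Contrast Improvement Index, Eq. (4).

    The contrast measure C is assumed to be the global standard deviation,
    since the excerpt does not define it explicitly.
    """
    c1 = np.asarray(enhanced, dtype=np.float64).std()
    c0 = np.asarray(original, dtype=np.float64).std()
    return c1 / c0
```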

2.2. Obtain Suspicious AD Area

2.2.1. Non-Subsampled Contourlet Transform

NSCT is a multi-scale, multi-directional two-dimensional image representation method that eliminates the sampling step of the contourlet transform. The structure of the NSCT is shown in Figure 4.
In Figure 4, the NSCT is composed of the non-subsampled pyramid (NSP) and the non-subsampled directional filter bank (NSDFB). The NSP performs multi-scale analysis and divides the image into high frequency sub-bands and low frequency sub-bands; the NSDFB performs multi-directional analysis to decompose the high frequency sub-bands into multiple directions.

2.2.2. Coefficient Operation

Processing High Frequency Coefficient

After the mammograms are decomposed by NSCT, the edge detail information of AD and the noise are both present in the high frequency sub-bands, and the noise mainly exists in the coefficients with a small absolute value. In order to enhance the edge details of the lesion area and effectively remove noise, a suitable threshold is needed. Because the image is decomposed at different scales and in different directions, the threshold of every sub-band is different, so different thresholds need to be selected adaptively. In [29], the coefficient standard deviation, coefficient mean, and coefficient maximum value of every sub-band were comprehensively considered. The threshold estimation formula is defined as:
T = \left(1 - \frac{C_m^h}{C_{\max}^h}\right) T_1   (5)
T_1 = \frac{1}{2}\sqrt{\frac{1}{M \times N}\sum_{j=1}^{M}\sum_{k=1}^{N}\left(C_{j,k}^h - C_m^h\right)^2}   (6)
Here, C_m^h is the mean of the sub-band coefficients of the layer, C_max^h is the maximum value of the sub-band coefficients of the layer, T_1 is derived from the standard deviation of the coefficients, T is the adaptively selected threshold, C_{j,k}^h is the high-frequency coefficient at position (j, k) of the sub-band, and M × N is the size of the image.
In this paper, the coefficients larger than the threshold are enhanced using a gain function, and the coefficients smaller than the threshold are directly set to zero to filter out noise. Based on the processing methods for NSCT high frequency coefficients in [29,30], this paper proposes an improved NSCT high frequency coefficient processing according to the actual characteristics of AD observed during the experiments, as follows:
C_{ij}^H = \begin{cases} a \times \max(C_{ij}^h) \times \left[ sigm\left(c\left(\frac{C_{ij}^h}{\max(C_{ij}^h)} - b\right)\right) - sigm\left(-c\left(\frac{C_{ij}^h}{\max(C_{ij}^h)} + b\right)\right) \right] & |C_{ij}^h| \ge T \\ 0 & |C_{ij}^h| < T \end{cases}   (7)
a = \frac{1}{sigm(c(1 - b)) - sigm(-c(1 + b))}   (8)
sigm(x) = \frac{1}{1 + e^{-x}}   (9)
Here, C_{ij}^h is an unprocessed high frequency coefficient, C_{ij}^H is the enhanced high frequency coefficient, T is the adaptive threshold, and b and c are enhancement coefficients. We enhance the coefficients whose amplitude is larger than the threshold and set the coefficients whose amplitude is smaller than the threshold to zero, namely, the processing performed by Equation (7). Equation (8) gives the normalizing coefficient of the enhancement function, and Equation (9) defines the sigm function used in Equations (7) and (8). After a large number of experiments, we set b = 0.3 and c = 30.
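The sketch below applies Equations (5)-(9) to one high-frequency sub-band array. The NSCT decomposition itself is assumed to come from an external implementation, and taking the maximum over the absolute coefficient values for C_max^h is an assumption.

```python
import numpy as np


def sigm(x):
    """Eq. (9): the sigmoid used by the gain function."""
    return 1.0 / (1.0 + np.exp(-x))


def enhance_high_sub_band(C, b=0.3, c=30.0):
    """Adaptive thresholding and nonlinear gain for one NSCT high-frequency
    sub-band, following Eqs. (5)-(9).

    `C` is a 2-D array of sub-band coefficients produced by whatever NSCT
    implementation is used (not shown here). Using the maximum of the absolute
    coefficients for C_max^h is an assumption.
    """
    C = np.asarray(C, dtype=np.float64)
    c_mean = C.mean()
    c_max = np.abs(C).max()
    T1 = 0.5 * np.sqrt(np.mean((C - c_mean) ** 2))        # Eq. (6)
    T = (1.0 - c_mean / c_max) * T1                       # Eq. (5)
    a = 1.0 / (sigm(c * (1 - b)) - sigm(-c * (1 + b)))    # Eq. (8)
    x = C / c_max
    gained = a * c_max * (sigm(c * (x - b)) - sigm(-c * (x + b)))  # Eq. (7), upper branch
    return np.where(np.abs(C) >= T, gained, 0.0)          # Eq. (7), thresholding
```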

Processing Low Frequency Coefficient

The low frequency coefficients obtained by the NSCT of the mammograms mainly contain the basic information of the image, which has a great influence on the image contrast. In order to effectively improve the contrast of the input image and improve the detection accuracy of AD, it is necessary to stretch the low-pass sub-band coefficients to expand the overall contrast of the image. Based on the processing method for NSCT low frequency coefficients in [31], this paper proposes an improved NSCT low frequency coefficient processing according to the actual characteristics of AD observed during the experiments, as follows:
\bar{C} = \frac{C_l - C_{\min}}{C_{\max} - C_{\min}}   (10)
C_l' = \begin{cases} f(\bar{C}) & \bar{C} \ge 0.45 \\ 0 & \bar{C} < 0.45 \end{cases}   (11)
f(x) = x^{1/3}   (12)
C^L = C_l' \times (C_{\max} - C_{\min}) + C_{\min}   (13)
Here, \bar{C} is the normalized low frequency coefficient, C_l is the input low frequency coefficient, C_l' is the transformed normalized coefficient, C_min is the minimum value of the low frequency coefficients, and C_max is the maximum value of the low frequency coefficients. Formula (11) applies the transformation to the normalized low frequency coefficients, and Formula (12) is the selected transformation function. Finally, the processed low frequency coefficient C^L is obtained by bringing the transformed normalized coefficient into Formula (13). The high frequency coefficients and the low frequency coefficients of the NSCT are manipulated by the above processing, and the result of the image reconstruction is shown in Figure 5.
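A minimal sketch of Equations (10)-(13) on a low-pass sub-band array follows; the 0.45 cut-off and the cube-root mapping are taken from the text.

```python
import numpy as np


def stretch_low_sub_band(C_l, cut=0.45, power=1.0 / 3.0):
    """Stretch the NSCT low-pass sub-band following Eqs. (10)-(13)."""
    C_l = np.asarray(C_l, dtype=np.float64)
    c_min, c_max = C_l.min(), C_l.max()
    C_bar = (C_l - c_min) / (c_max - c_min)                 # Eq. (10)
    C_prime = np.where(C_bar >= cut, C_bar ** power, 0.0)   # Eqs. (11)-(12)
    return C_prime * (c_max - c_min) + c_min                # Eq. (13)
```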
Figure 5 shows that through the NSCT processing, the edge details of the lesion area are enhanced and the contrast of the mammograms is increased.
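For orientation, the sketch below shows how the two coefficient operations could be wired around an NSCT decomposition and reconstruction. The callables nsct_decompose and nsct_reconstruct are hypothetical placeholders for whichever NSCT implementation is used (no standard Python NSCT API is assumed), and enhance_high_sub_band and stretch_low_sub_band are the sketches given above.

```python
def nsct_enhance(img, nsct_decompose, nsct_reconstruct, levels=3):
    """Tie the coefficient operations together around an NSCT implementation.

    `nsct_decompose` and `nsct_reconstruct` are hypothetical callables supplied
    by the NSCT library in use; they are assumed to return/accept a low-pass
    sub-band plus a list of high-frequency sub-bands per scale.
    """
    low, highs = nsct_decompose(img, levels)
    low = stretch_low_sub_band(low)                            # low frequency processing
    highs = [[enhance_high_sub_band(band) for band in scale]   # high frequency processing
             for scale in highs]
    return nsct_reconstruct(low, highs)
```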

2.2.3. Objective Analysis

In order to evaluate the contrast enhancement effect more objectively, the above two indicators of ENL and CII are used for comparison. The results are shown in Table 2:
Table 2 shows that the mammogram after NSCT processing has the smallest ENL value and the largest CII value, which indicates that the NSCT enhances the mammogram contrast.

2.3. Detect AD Area

In this section, a new algorithm for AD detection is proposed. The algorithm combines PCNN with improved Otsu, and it obtains the final detection results of AD by marking the maximum connected region.

2.3.1. Improved OTSU

In 1979, Nobuyuki Otsu [32] proposed the maximum inter-class variance threshold selection method. The most important feature of this algorithm is that it can adaptively determine the optimal segmentation threshold of the image. In the threshold selection process, the inter-class variance between the target and the background is computed for every candidate gray value, and the threshold that maximizes this variance is the optimal segmentation threshold. However, the algorithm does not segment the image well when the two peaks of the gray histogram are not obvious. Therefore, this work adopts the improved Otsu algorithm in [33], which replaces the mean of the image in the Otsu algorithm with the average variance of the image, as follows:
T = \arg\max_{0 \le T \le L-1} \left[ \omega_0 \left(\sigma_0^2 - \sigma^2\right)^2 + \omega_1 \left(\sigma_1^2 - \sigma^2\right)^2 \right]   (14)
Here, σ_0^2 and σ_1^2 are the variances of the gray values of the two pixel classes L_0 and L_1, which are obtained by dividing the image with the threshold T; σ^2 is the variance of the gray values of the whole image; and ω_0 and ω_1 are the probabilities of occurrence of the pixel gray values in the two classes. By incorporating the average variance of the image into the Otsu algorithm, images with high contrast can be segmented effectively. Because the pre-processing and the NSCT processing have enhanced the contrast of the mammograms, this method meets the requirements of this research work.

2.3.2. Improved PCNN

The traditional PCNN model has a large number of parameters to be set, and the setting of these parameters has a great influence on AD detection. In order to ensure stable AD detection accuracy, based on the simplified PCNN model in [34], we incorporate the improved Otsu into the initial threshold setting of the model, so that only three parameters need to be set manually: β, a_θ, and V_θ. Each neuron in the model consists of three parts: the input field, the modulation field, and the spike generator. The structure of the model is shown in Figure 6.
In Figure 6, S_ij is the gray value of the pixel (i, j) of the input image, and (k, l) is a pixel adjacent to (i, j); F_ij is the feeding channel; L_ij is the linking channel; β is the linking strength coefficient; U_ij is the internal activation signal; θ_ij is the threshold; Y_ij is the pulse output of the input pixel; Y_kl is the pulse output of the neighborhood pixel; W is the linking weight matrix of channel L; V_θ is the amplification coefficient of the threshold θ_ij; and a_θ is the time decay coefficient of the threshold θ_ij. The mathematical iterative equations of the PCNN model are as follows:
F_{ij}(n) = S_{ij}   (15)
L_{ij}(n) = \sum_{k,l} W_{ij,kl} Y_{kl}(n-1)   (16)
W_{ij,kl} = \begin{cases} 0 & (i,j) = (k,l) \\ \frac{1}{\|(i,j) - (k,l)\|_2} & (i,j) \ne (k,l) \end{cases}   (17)
U_{ij}(n) = F_{ij}(n) \left(1 + \beta L_{ij}(n)\right)   (18)
Y_{ij}(n) = \begin{cases} 1 & U_{ij}(n) > \theta_{ij}(n-1) \\ 0 & \text{otherwise} \end{cases}   (19)
\theta_{ij}(n) = e^{-a_\theta} \theta_{ij}(n-1) + V_\theta Y_{ij}(n)   (20)
Here, W_{ij,kl} is the reciprocal of the Euclidean distance between two adjacent pixel points of the image. It can be seen from the iterative equations of the model that the only variable parameters are β, a_θ, and V_θ. The PCNN parameter settings are shown in Table 3 and Figure 7.
Table 3 and Figure 7 show that as the value of V_θ increases, the contour of the detection result becomes clearer; as the values of β and a_θ increase, the area of the detection result becomes larger and more noise points appear. Moreover, no result can be detected when the values of β and a_θ are too small. In this section, the optimal threshold calculated by Otsu is set as the initial threshold of the PCNN model, which reduces the impact of manually set parameters. After a large number of parameter-setting experiments, we set β to 1.2, a_θ to 0.5, and V_θ to 28. However, there are some isolated points in the detection results. In order to obtain accurate detection results, the maximum connected region is used to remove these isolated points. The final result is shown in Figure 8.
In Figure 8d, the area within the red line is the AD area marked by the expert, and the area within the yellow line is the detection result of the proposed method; it can be seen that the proposed detection method can accurately detect AD and that the contour of the lesion is clear.
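A minimal sketch of the detection stage is given below: the simplified PCNN iteration of Equations (15)-(20) with β = 1.2, a_θ = 0.5, V_θ = 28 and the Otsu-derived initial threshold, followed by the maximum-connected-region cleanup. The number of iterations and the use of the final pulse output as the detection mask are assumptions, since the stopping rule is not spelled out in this excerpt.

```python
import numpy as np
from scipy.ndimage import convolve, label


def pcnn_detect(S, theta0, beta=1.2, a_theta=0.5, V_theta=28.0, n_iter=10):
    """Simplified PCNN of Eqs. (15)-(20) plus largest-connected-region cleanup.

    `theta0` is the initial threshold, e.g. the output of improved_otsu_threshold().
    The iteration count and the use of the last pulse output as the result are
    assumptions; the excerpt does not state a stopping criterion.
    """
    S = np.asarray(S, dtype=np.float64)                      # Eq. (15): F = S
    # Eq. (17): 3x3 linking weights, reciprocal Euclidean distance, 0 at the centre
    W = np.array([[1 / np.sqrt(2), 1.0, 1 / np.sqrt(2)],
                  [1.0,            0.0, 1.0],
                  [1 / np.sqrt(2), 1.0, 1 / np.sqrt(2)]])
    theta = np.full(S.shape, float(theta0))
    Y = np.zeros(S.shape)
    for _ in range(n_iter):
        L = convolve(Y, W, mode='constant', cval=0.0)        # Eq. (16)
        U = S * (1.0 + beta * L)                             # Eq. (18)
        Y = (U > theta).astype(np.float64)                   # Eq. (19)
        theta = np.exp(-a_theta) * theta + V_theta * Y       # Eq. (20)
    # keep only the maximum connected region to remove isolated points
    mask = Y.astype(bool)
    labels, n = label(mask)
    if n > 0:
        sizes = np.bincount(labels.ravel())
        sizes[0] = 0
        mask = labels == sizes.argmax()
    return mask
```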

3. Material and Assessment Method

The content of this section is divided into two parts: one part introduces the data used in this work, and the other part introduces the methods used to evaluate this work.

3.1. The Choice of Database

In this study, the Digital Database for Screening Mammography (DDSM) database [35] of X-ray mammograms from multiple medical institutions in the United States is used to test the proposed method. The database includes approximately 2500 cases; every case includes two images of each breast, along with some associated patient information (age, breast density rating, keyword description of abnormalities) and image information (scanner, spatial resolution, etc.). The information of the lesion area is marked by professional physicians. Original images and the corresponding AD markings from the DDSM database are shown in Figure 9.
Figure 9a,b are the mammography images of "B_3060_1.RIGHT_MLO" and "B_3109_1.RIGHT_CC" in the DDSM database, and the red areas in (a1,b1) are the AD regions marked by the professional physician. In order to reduce the complexity of detecting AD, we consulted the references and conducted experiments to set the size of the mammograms to 1024 × 1024.

3.2. Assessment Method

This paper selects two assessment methods to evaluate the detection results. One method is subjective evaluation and the other is objective evaluation.

3.2.1. Subjective Assessment

In the DDSM database, experts have marked the AD area of the mammograms, so we selected the visual perception evaluation criteria as the subjective evaluation method of the detection results [36]. In order to ensure the objectivity and accuracy of the assessment results, relevant rules were formulated as follows:
Every image was observed for no more than 20 s. For the mammograms with AD, the score ranges from 1 to 10. In order to facilitate the evaluation by the researchers, the lesion area marked by the expert and the detected AD area are combined into one image, and every researcher gives a score to every image based on the overlapping area. The average score of every image is then taken as the final value. If the value is larger than 8, the detection result is considered correct; otherwise, it is considered an error.

3.2.2. Objective Assessment

In order to evaluate our proposed detection method more objectively, we use sensitivity, specificity, accuracy, F1-score, and the receiver operating characteristic (ROC) curve as the objective evaluation criteria. The definitions are as follows:
Sensitivity = \frac{TP}{TP + FN}   (21)
Specificity = \frac{TN}{TN + FP}   (22)
Accuracy = \frac{TP + TN}{TP + FN + TN + FP}   (23)
F1\text{-}score = \frac{2 \times TP}{2 \times TP + FP + FN}   (24)
Sensitivity indicates the ability of the proposed method to detect the expertly labeled AD area as an AD area. Specificity indicates the ability of the proposed method to label the expert-marked background area of the mammography image as a background area. Accuracy indicates the precision with which the proposed method detects AD. F1-score indicates the similarity between the AD area we detected and the actual lesion area. Here, TP, FN, TN, and FP represent True Positive, False Negative, True Negative, and False Positive, respectively. Their practical significance is shown in Table 4.
Their values are the numbers of pixels in the true positive, false negative, true negative, and false positive areas obtained by comparing the detected AD with the area marked by the expert. The above four assessment indicators have a numerical range between 0 and 1; the closer to 1, the better the result.
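The following small sketch computes the pixel-level metrics of Equations (21)-(24) from a detected binary mask and the expert mask; both are assumed to be boolean arrays of the same shape, with True marking AD pixels.

```python
import numpy as np


def pixel_metrics(pred_mask, expert_mask):
    """Pixel-level sensitivity, specificity, accuracy and F1-score, Eqs. (21)-(24)."""
    pred = np.asarray(pred_mask, dtype=bool)
    truth = np.asarray(expert_mask, dtype=bool)
    tp = np.sum(pred & truth)
    fn = np.sum(~pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return sensitivity, specificity, accuracy, f1
```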
In addition, the ROC curve is used to evaluate the detection results. The True Positive Rate (TPR) is the ordinate and the False Positive Rate (FPR) is the abscissa. The definitions of TPR and FPR are as follows:
TPR = \frac{TP}{TP + FN}   (25)
FPR = \frac{FP}{FP + TN}   (26)
The assessment index of the area under curve (AUC) is the area under the ROC curve; the AUC has a numerical range between 0 and 1, and the closer to 1, the better the result.
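Sweeping an ROC curve requires a continuous per-pixel score rather than a binary mask; using a score map such as the PCNN internal activation U is an assumption on our part, since the excerpt does not state which score is swept. The sketch below uses scikit-learn for the curve and AUC.

```python
from sklearn.metrics import roc_curve, auc


def pixel_roc_auc(score_map, expert_mask):
    """ROC curve (Eqs. (25)-(26)) and AUC over pixels.

    `score_map` is assumed to be a continuous per-pixel score (for example the
    PCNN internal activation U).
    """
    fpr, tpr, thresholds = roc_curve(expert_mask.ravel().astype(int),
                                     score_map.ravel())
    return fpr, tpr, auc(fpr, tpr)
```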

4. Experimental Results and Discussion

In this experiment, the computer configuration is an Intel Core i5-4590 CPU with a main frequency of 3.3 GHz, 4 GB of memory, and the Windows 7 64-bit operating system, and the simulation software is MATLAB R2018b. In order to facilitate the experimental detection, we select 30 representative mammograms based on the density rating of the images, with 10 images for every density rating.

4.1. Subjective Contrast of Different Images

We invited 20 researchers to participate in the experiment; their research direction is the detection of breast lesions, and they all have rich research experience. During the test, the 20 researchers stood in a row and independently observed the contrast images on the display, and every researcher gave a score to every image. The evaluation results are shown in Table 5.
Table 5 shows that as the density rating increases, the subjective evaluation score has a downward trend, which indicates that the density rating has an influence on AD detection. Among the 30 images, only the two images "B_3085_1.LEFT_MLO" and "B_3395_1.RIGHT_MLO" are assessed as detection errors, which indicates that the method proposed in this paper can detect AD accurately.

4.2. Objective Contrast of Different Images

4.2.1. Comparing the Detection Results of Different Images

This section adopts the objective assessment indicators to demonstrate the validity of the proposed method. The results are shown in Table 6.
In Table 6, as the density rating of the image increases, the specificity and accuracy change slightly, while the sensitivity and F1-score show a decreasing trend. After analysis, we consider this change normal: as the density rating of the image increases, the difference between the AD area and the normal area becomes smaller, so the AD lesion area is more difficult to detect, and this has a certain influence on the detection of edge details. Finally, we obtain the mean values of the objective indicators: the specificity is 98.73%, the accuracy is 93.16%, the F1-score is 79.80%, and the sensitivity is 72.50%. Overall, the detection results of the proposed method are at a relatively high level. In order to compare the detection results more intuitively, the detection results for mammograms of different density ratings are shown in Figure 10.
In Figure 10, (a1,b1) are images with density rating 1, (a2,b2) are images with density rating 2, and (a3,b3) are images with density rating 3; the region within the red line is the lesion area marked by the expert, and the area within the yellow line is the detection result of the proposed method. It can be seen that the proposed method can accurately detect the AD area.

4.2.2. Comparison of Different Experimental Methods

This section demonstrates the effectiveness of the proposed method by comparing different experimental methods. First, the image is pre-processed and NSCT processed; then AD detection is performed using three different methods, namely the proposed method, Otsu, and PCNN. The objective assessment indicators of the experimental results are compared in Table 7.
In Table 7, the data for every method are the averages of the detection results of the 30 images. It can be seen that the specificity of the proposed method is 98.73%, the accuracy is 93.16%, and the F1-score is 79.80%, which are higher than those of the other two methods. However, the sensitivity of the proposed method is lower than the sensitivity of Otsu. After analysis, this is because the Otsu method uses a single threshold for binary segmentation, which causes normal breast tissue with gray values close to those of AD to be mistaken for AD. Therefore, the detected AD area is too large, which makes the true positive value too high. In addition, the ROC curves of the three methods are compared in Figure 11.
Figure 11 shows that the AUC of the detection method proposed in this paper reaches 0.93. For a more intuitive comparison, the results of the different experimental methods for detecting AD in the original image are shown in Figure 12.
The area within the red line in Figure 12 is the AD lesion area marked by the expert, the area within the yellow line is the AD area detected by the method proposed in this paper, the area within the green line is the AD area detected by Otsu, and the area within the blue line is the AD area detected by PCNN. It can be seen in Figure 12 that the AD region detected by the proposed method has the highest similarity with the region marked by the expert. In addition, the proposed method is compared with advanced methods for detecting AD. The results are shown in Table 8.
Table 8 shows that the sensitivity of the proposed method is lower than that of the other four methods, but the specificity and accuracy are higher than those reported in [18], and the AUC is higher than those reported in [17,19,20]. The sensitivity is lower because, as the image density rating increases, the contrast between the AD area and the background area becomes smaller and smaller, resulting in a decrease in the number of TP pixels, which in turn causes a decrease in sensitivity. Therefore, in order to ensure the accuracy of AD detection, we can only enhance the AD region in the high frequency part as much as possible in the NSCT processing. This is also an aspect that we will continue to address in the future. Overall, the method proposed in this paper is competitive with these advanced methods.

5. Conclusions

This paper combines the NSCT with the Otsu-PCNN for detecting AD in mammograms, which is a new method for AD detection. Firstly, the top–bottom hat transformation and the Gamma transformation are combined to enhance the mammograms; this not only enhances the AD information, but also suppresses the surrounding tissue and background information. Secondly, NSCT is employed to decompose the mammograms. For the high frequency coefficients, we enhance the coefficients whose amplitude is larger than the threshold and set the coefficients whose amplitude is smaller than the threshold to zero, which enhances the edge detail information of AD and filters out the noise; for the low frequency coefficients, the low-pass sub-band coefficients are stretched to expand the overall contrast of the mammograms. Finally, the AD is obtained by the improved PCNN model. The optimal threshold calculated by Otsu is set as the initial threshold of the PCNN model, which reduces the impact of manually set parameters and the complexity of the method, and isolated points are then removed by marking the maximum connected region to obtain the final detection result. According to the principle of the proposed model, our method can be applied to other data sets to achieve image detection or segmentation, but the parameters need to be adjusted.
In this work, 30 mammography images of different density ratings in the DDSM database are selected for experimental comparison. We divide the data set into three parts based on the image density rating and examine the adaptability of our proposed method to different images, which is a highlight of this work. The subjective evaluation result is that only two images are detection errors, since their scores are lower than 8. The objective evaluation results are that the specificity of the proposed method is 98.73%, the accuracy is 93.16%, the F1-score is 79.80%, and the AUC is 0.93. These results fully demonstrate the reliability of the proposed method, and it can effectively help doctors to perform computer-aided detection of AD. These experiments are based on a limited data set; in the future, we plan to work with local hospitals to increase the experimental data set and make the method more universal.

Author Contributions

Conceptualization, G.D., M.D., Y.S. and S.L.; methodology, G.D., M.D., Y.S., S.L. and X.M.; software, G.D., M.D., S.L., H.W., L.M. and B.L.; validation, G.D., M.D. and S.L.; formal analysis, G.D., M.D., Y.S. and X.M.; investigation, G.D., M.D., S.L., H.W., L.M. and B.L.; data collection, G.D., M.D. and S.L.; writing—original draft preparation, G.D., S.L., H.W., L.M. and B.L.; writing—review and editing, G.D., M.D., Y.S., S.L. and X.M.; project administration, G.D. and M.D.; funding acquisition, G.D., M.D. and X.M.

Funding

This research is funded by “The Key Research Projects of Henan Higher Education Institutions, grant number 18A510017” and “The Henan Postdoctoral Research Project, grant number 001701004”.

Acknowledgments

The experimental data in this article are provided by the DDSM database.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Dong, M.; Wang, Z.; Dong, C.H.; Mu, X.M.; Ma, Y.D. Classification of region of interest in mammograms using dual contourlet transform and improved KNN. J. Sens. 2017, 2017, 3213680. [Google Scholar] [CrossRef]
  2. Chen, W.; Zheng, R.; Zhang, S.; Zeng, H.; Xia, C.; Zuo, T.; Yang, Z.; He, J. Cancer incidence and mortality in China in 2013: An analysis based on urbanization level. Chin. J. Cancer Res. 2017, 29, 1–10. [Google Scholar] [CrossRef] [PubMed]
  3. DeSantis, C.E.; Ma, J.; Goding Sauer, A.; Newman, L.A.; Jemal, A. Breast cancer statistics, 2017, racial disparity in mortality by state. CA Cancer J. Clin. 2017, 67, 439–448. [Google Scholar] [CrossRef] [PubMed]
  4. Dong, M.; Lu, X.Y.; Ma, Y.D.; Guo, Y.N.; Ma, Y.R.; Wang, K.J. An efficient approach for automated mass segmentation and classification in mammograms. J. Digit. Imaging 2015, 28, 613–625. [Google Scholar] [CrossRef] [PubMed]
  5. Hu, M.F.; Zhang, H.M.; Yang, J.; Tian, P.; Wu, X.; Xu, H.J.; Shao, X. Application value of digital breast tomosynthesis in diagnosis for breast architectural distortion. China Mod. Dr. 2018, 56, 122–125. [Google Scholar]
  6. Guo, Q.; Shao, J.; Ruiz, V. Investigation of support vector machine for the detection of architectural distortion in mammographic images. J. Phys. Conf. Ser. 2005, 15, 88–94. [Google Scholar] [CrossRef]
  7. Ayres, F.J.; Rangayyan, R.M. Design and performance analysis of oriented feature detectors. J. Electron. Imaging 2007, 16, 023007. [Google Scholar] [CrossRef]
  8. Ayres, F.J.; Rangayyan, R.M. Reduction of false positives in the detection of architectural distortion in mammograms by using a geometrically constrained phase portrait model. Int. J. Comput. Assist. Radiol. Surg. 2007, 1, 361–369. [Google Scholar] [CrossRef]
  9. Banik, S.; Rangayyan, R.M.; Desautels, J.E.L. Detection of architectural distortion in prior mammograms. IEEE Trans. Med. Imaging 2011, 30, 279–294. [Google Scholar] [CrossRef]
  10. Biswas, S.K.; Mukherjee, D.P. Recognizing architectural distortion in mammogram: A multiscale texture modeling approach with GMM. IEEE Trans. Biomed. Eng. 2011, 58, 2023–2030. [Google Scholar] [CrossRef]
  11. Anand, S.; Rathna, R.A.V. Architectural Distortion Detection in Mammogram using Contourlet Transform and Texture Features. Int. J. Comput. Appl. 2013, 74, 12–19. [Google Scholar] [CrossRef]
  12. Rangayyan, R.M.; Banik, S.; Chakraborty, J.; Mukhopadhyay, S.; Desautels, J.E.L. Measures of divergence of oriented patterns for the detection of architectural distortion in prior mammograms. Int. J. Comput. Assist. Radiol. Surg 2013, 8, 527–545. [Google Scholar] [CrossRef] [PubMed]
  13. Zhang, S.J.; Chen, H.J.; Li, Y.F.; Yao, C.; Cheng, L. Detection of Architectural Distortion in Mammograms. Acta Autom. Sinica 2014, 40, 1764–1772. [Google Scholar]
  14. Lakshmanan, R.; Shiji, T.P.; Thomas, V.; Jacob, S.M.; Pratab, T. A Preprocessing Method for Reducing Search Area for Architectural Distortion in Mammographic Images. In Proceedings of the Fourth International Conference on Advances in Computing and Communications, Kochi, India, 27–30 August 2014; pp. 101–104. [Google Scholar]
  15. Yoshikawa, R.; Teramoto, A.; Matsubara, T.; Fujita, H. Automated detection of architectural distortion using improved adaptive gabor filter. In International Workshop on Digital Mammography; Springer: Cham, Switzerland, 2014; pp. 606–611. [Google Scholar]
  16. Matsubara, T.; Ito, A.; Tsunomori, A.; Hara, T.; Muramatsu, C.; Endo, T.; Fujita, H. An automated method for detecting architectural distortions on mammograms using direction analysis of linear structures. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 2661–2664. [Google Scholar]
  17. Lakshmanan, R.; Jacob, S.M.; Pratab, T.; Thomas, C.; Thomas, V. Detection of architectural distortion in mammograms using geometrical properties of thinned edge structures. Intell. Autom. Soft Comput. 2017, 23, 183–197. [Google Scholar] [CrossRef]
  18. Narváez, F.; Alvarez, J.; Garcia-Arteaga, J.D.; Tarquino, J.; Romero, E. Characterizing Architectural Distortion in Mammograms by Linear Saliency. J. Med. Syst. 2017, 41, 26. [Google Scholar] [CrossRef] [PubMed]
  19. Akhtar, Y.; Mukherjee, D.P. Detection of architectural distortion from the ridges in a digitized mammogram. Signal Image Video Process. 2018, 12, 1285–1292. [Google Scholar] [CrossRef]
  20. Costa, A.C.; Oliveira, H.C.; Catani, J.H.; de Barros, N.; Melo, C.F.; Vieira, M.A. Data Augmentation for Detection of Architectural Distortion in Digital Mammography using Deep Learning Approach. arXiv 2018, arXiv:1807.03167. [Google Scholar]
  21. Durand, M.A.; Wang, S.; Hooley, R.J.; Raghu, M.; Philpotts, L.E. Tomosynthesis-detected Architectural Distortion: Management Algorithm with Radiologic-Pathologic Correlation. RadioGraphics 2016, 36, 311–321. [Google Scholar] [CrossRef]
  22. Li, S.Y.; Dong, M.; Du, G.M.; Mu, X.M. Attention Dense-U-Net for Automatic Breast Mass Segmentation in Digital Mammogram. IEEE Access 2019, 7, 59037–59047. [Google Scholar] [CrossRef]
  23. Da Cunha, A.L.; Zhou, J.; Do, M.N. The nonsubsampled contourlet transform: Theory, design, and applications. IEEE Trans. Image Process. 2006, 15, 3089–3101. [Google Scholar] [CrossRef]
  24. Gao, C.; Zhou, D.; Guo, Y. Automatic iterative algorithm for image segmentation using a modified pulse-coupled neural network. Neurocomputing 2013, 119, 332–338. [Google Scholar] [CrossRef]
  25. Xie, W.Y.; Li, Y.S.; Ma, Y.D. PCNN-based level set method of automatic mammographic image segmentation. OPTIK 2016, 127, 1644–1650. [Google Scholar] [CrossRef]
  26. Wang, W.H.; Wang, W.Q.; Hu, Z.P. Segmenting retinal vessels with revised top-bottom-hat transformation and flattening of minimum circumscribed ellipse. Med. Biol. Eng. Comput. 2019, 57, 1481–1496. [Google Scholar] [CrossRef] [PubMed]
  27. Wang, P.; Liu, F.; Yang, C.; Luo, X. Blind forensics of image gamma transformation and its application in splicing detection. J. Vis. Commun. Image Represent. 2018, 55, 80–90. [Google Scholar] [CrossRef]
  28. Liang, Z.Y.; Gou, X.S. A Segmentation Method for Mammogram X-ray Image Based on Image Enhancement with Wavelet Fusion. Adv. Intell. Syst. Res. 2015, 122–129. [Google Scholar] [CrossRef]
  29. Guo, M.; Jiang, A.M.; Cao, M. New Infrared Image Nonlinear Enhancement Algorithm Based on Nonsubsampled Contourlet Transform. Comput. Mod. 2017, 3, 77–79. [Google Scholar]
  30. Cao, M.; Cheng, Y.L.; Sheng, H.X.; Qiu, C.C.; Yu, K. Application of improved histogram equalization and NSCT transform algorithm in infrared image enhancement. Appl. Sci. Technol. 2016, 43, 24–27. [Google Scholar]
  31. Du, C.; Jia, Z.; Qin, X.; Yang, J.; Hu, Y.; Li, D. Remote Sensing Image Fuzzy Enhancement Algorithm Based on NSCT. Comput. Eng. 2012, 38, 188–190. [Google Scholar]
  32. Otsu, N. A threshold selection method from gray-histogram. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
  33. Chen, X.; Sun, J.; Yin, K.Y.; Yu, J.P. Sea-Land Segmentation Algorithm of SAR Image Based on Otsu Method and Statistical Characteristic of Sea Area. J. Data Acquis. Process. 2014, 29, 603–608. [Google Scholar]
  34. Zhang, P.K.; Zhang, X.; Li, Q.; Zhang, H. Adaptive Segmentation of Image based on Pulse-Coupled Neural Network and Two-dimensional Entropy. Commun. Technol. 2017, 50, 111–114. [Google Scholar]
  35. Heath, M.; Bowyer, K.; Kopans, D.; Moore, R.; Kegelmeyer, P., Jr. The digital database for screening mammography. In Proceedings of the 5th International Workshop on Digital Mammography, Toronto, ON, Canada, 11–14 June 2000; pp. 212–218. [Google Scholar]
  36. Civcik, L.; Yilmaz, B.; Özbay, Y.; Emlik, G.D. Detection of Microcalcification in digitized mammograms with multistable cellular neural networks using a new image enhancement method: Automated lesion intensity enhancer. Turk. J. Electr. Eng. Comput. Sci. 2015, 23, 853–872. [Google Scholar] [CrossRef]
Figure 1. The flow chart of the proposed algorithm. NCST: non-subsampled contourlet transform; AD: architectural distortion; PCNN: pulse coupled neural network.
Figure 2. (a) Original region of interest (ROI); (b) AD area marked by expert; (c) top–bottom hat transformation result (length 50, angle 100); (d) top–bottom hat transformation result (length 100, angle 100); (e) top–bottom hat transformation result (length 200, angle 100); and (f) top–bottom hat transformation result (length 100, angle 130).
Figure 3. (a) Original ROI; (b) AD area marked by expert; (c) top–bottom hat transformation result; and (d) top–bottom hat transformation and Gamma transformation result.
Figure 4. The structure of NSCT: (a) NSP filter; (b) NSDFB frequency decomposition.
Figure 5. (a) Original ROI; (b) AD area marked by expert; (c) pre-processing result; (d) NSCT processing result.
Figure 6. Simplified PCNN model.
Figure 7. Detection results of different PCNN parameters. The rows of Table 3 correspond to panels (a–g), respectively.
Figure 8. (a) Original ROI; (b) PCNN processing results; (c) detection results of removing isolated points; and (d) displaying the detection results on the original image.
Figure 9. Image with AD in Digital Database for Screening Mammography (DDSM): (a,b) are original images; (a1) and (b1) are the AD image marked by the expert.
Figure 10. Detection results of images of different density rating: (a1) B_3022_1.RIGHT_CC; (b1) B_3109_1.RIGHT_CC; (a2) B_3412_1.LEFT_CC; (b2) B_3052_1.LEFT_MLO; (a3) B_3075_1.RIGHT_MLO; and (b3) D_4077_1.LEFT_CC.
Figure 11. Receiver operating characteristic (ROC) curve of the detection results from different experimental methods.
Figure 12. The detection results of different experimental methods.
Table 1. Comparison of enhancement effects using different methods.
        Original    Top–Bottom Hat Transformation    Gamma Transformation    Proposed
ENL     29.2806     5.6315                           4.4295                  1.6684
CII     1           1.4189                           1.5867                  1.7489
Bold font number indicates the best performance in each class.
Table 2. Comparative analysis.
        Original Image    Pre-Processing Image    NSCT Processing Image
ENL     29.2806           1.6684                  0.6016
CII     1                 1.7489                  2.2821
Bold font number indicates the best performance in each class.
Table 3. Set the PCNN parameters.
β      a_θ    V_θ    Result
1.2    0.5    27     Figure 7a
1.2    0.5    28     Figure 7b
1.2    0.5    29     Figure 7c
1.2    0.2    28     Figure 7d
1.2    0.8    28     Figure 7e
1      0.5    28     Figure 7f
1.4    0.5    28     Figure 7g
Table 4. The concept of confusion matrix.
Predict Result    Actual Situation
                  AD      Non-AD
AD                TP      FP
Non-AD            FN      TN
Table 5. Assessment score of the detection results.
Image Name              Density    Score          Image Name              Density    Score
B_3022_1.RIGHT_CC       1          8.65 ± 0.47    B_3412_1.LEFT_MLO       2          8.38 ± 0.57
B_3022_1.RIGHT_MLO      1          8.93 ± 0.18    C_0020_1.RIGHT_MLO      2          8.39 ± 0.46
B_3059_1.RIGHT_CC       1          8.85 ± 0.23    C_0137_1.LEFT_CC        2          8.13 ± 0.43
B_3109_1.RIGHT_CC       1          8.90 ± 0.20    C_0137_1.LEFT_MLO       2          8.35 ± 0.48
B_3372_1.LEFT_CC        1          8.78 ± 0.29    C_0170_1.LEFT_CC        2          8.42 ± 0.49
B_3407_1.RIGHT_CC       1          8.70 ± 0.40    B_3026_1.RIGHT_MLO      3          8.25 ± 0.67
B_3407_1.RIGHT_MLO      1          8.67 ± 0.36    B_3075_1.RIGHT_MLO      3          8.46 ± 0.46
B_3504_1.RIGHT_CC       1          8.25 ± 0.77    B_3076_1.LEFT_CC        3          8.10 ± 0.77
B_3504_1.RIGHT_MLO      1          8.40 ± 0.49    B_3085_1.LEFT_CC        3          8.17 ± 0.41
C_0047_1.LEFT_CC        1          8.50 ± 0.50    B_3085_1.LEFT_MLO       3          7.76 ± 0.58
B_3017_1.LEFT_CC        2          8.30 ± 0.70    B_3134_1.RIGHT_CC       3          8.05 ± 0.83
B_3052_1.LEFT_MLO       2          8.25 ± 0.54    B_3134_1.RIGHT_MLO      3          8.11 ± 0.72
B_3060_1.RIGHT_MLO      2          8.45 ± 0.74    B_3373_1.RIGHT_MLO      3          8.31 ± 0.41
B_3401_1.LEFT_CC        2          8.15 ± 0.62    B_3395_1.RIGHT_MLO      3          7.83 ± 0.67
B_3412_1.LEFT_CC        2          8.27 ± 0.56    D_4077_1.LEFT_CC        3          8.15 ± 0.80
Table 6. Detection results of images of different density rating.
Image Name              Density Rating    Sensitivity    Specificity    Accuracy    F1-Score
B_3022_1.RIGHT_CC       1                 90.51%         96.73%         95.96%      84.74%
B_3022_1.RIGHT_MLO      1                 97.48%         97.58%         97.58%      79.05%
B_3059_1.RIGHT_CC       1                 71.03%         99.76%         96.17%      82.26%
B_3109_1.RIGHT_CC       1                 92.11%         98.39%         96.29%      94.31%
B_3372_1.LEFT_CC        1                 94.61%         96.29%         96.20%      73.49%
B_3407_1.RIGHT_CC       1                 73.34%         96.42%         92.15%      77.55%
B_3407_1.RIGHT_MLO      1                 73.00%         99.51%         93.40%      83.60%
B_3504_1.RIGHT_CC       1                 71.43%         98.09%         91.51%      80.59%
B_3504_1.RIGHT_MLO      1                 81.67%         97.38%         94.51%      84.48%
C_0047_1.LEFT_CC        1                 73.93%         99.10%         95.81%      82.18%
B_3017_1.LEFT_CC        2                 67.35%         99.85%         92.70%      80.49%
B_3052_1.LEFT_MLO       2                 87.28%         98.50%         97.18%      87.93%
B_3060_1.RIGHT_MLO      2                 69.86%         99.82%         91.98%      82.02%
B_3401_1.LEFT_CC        2                 59.85%         99.78%         91.88%      74.88%
B_3412_1.LEFT_CC        2                 66.36%         99.97%         86.82%      79.76%
B_3412_1.LEFT_MLO       2                 60.84%         99.86%         87.13%      75.65%
C_0020_1.RIGHT_MLO      2                 63.01%         99.38%         93.32%      75.88%
C_0137_1.LEFT_CC        2                 75.89%         97.04%         93.47%      79.69%
C_0137_1.LEFT_MLO       2                 75.19%         99.50%         98.16%      81.86%
C_0170_1.LEFT_CC        2                 64.73%         99.40%         93.48%      77.23%
B_3026_1.RIGHT_MLO      3                 64.97%         96.76%         89.10%      74.18%
B_3075_1.RIGHT_MLO      3                 78.86%         99.31%         94.06%      87.20%
B_3076_1.LEFT_CC        3                 69.83%         99.59%         93.61%      81.44%
B_3085_1.LEFT_CC        3                 73.57%         99.22%         94.84%      82.96%
B_3085_1.LEFT_MLO       3                 57.56%         99.02%         87.82%      71.86%
B_3134_1.RIGHT_CC       3                 63.88%         98.92%         91.59%      76.06%
B_3134_1.RIGHT_MLO      3                 65.05%         98.36%         90.91%      76.20%
B_3373_1.RIGHT_MLO      3                 57.09%         99.88%         88.61%      72.68%
B_3395_1.RIGHT_MLO      3                 55.51%         99.89%         91.70%      71.18%
D_4077_1.LEFT_CC        3                 79.35%         98.70%         96.89%      82.65%
Bold font number indicates the best performance in each class.
Table 7. Detection results of different experimental methods.
Experimental Method    Sensitivity (%)    Specificity (%)    Accuracy (%)    F1-Score (%)
Proposed               72.50 ± 11.10      98.73 ± 1.15       93.16 ± 3.07    79.80 ± 5.13
Otsu                   94.31 ± 1.21       68.28 ± 7.10       76.13 ± 7.26    73.22 ± 6.82
PCNN                   68.92 ± 4.18       95.54 ± 2.13       91.61 ± 1.28    69.49 ± 7.42
Bold font number indicates the best performance in each class.
Table 8. Compare proposed methods with advanced methods.
Method                      Database    Sensitivity (%)    Specificity (%)    Accuracy (%)    AUC
Proposed                    DDSM        72.50              98.73              93.16           0.93
Lakshmanan et al. [17]      DDSM        89.8               85                 –               0.90
Fabián et al. [18]          DDSM        85                 93                 89              0.93
Akhtar et al. [19]          DDSM        85                 80                 –               0.86
Costa et al. [20]           DDSM        –                  –                  86.1            0.74
Bold font number indicates the best performance in each class.
