The DICOM CT Image Compression Based On Enhanced Lossless Prediction And Multilevel Thresholding Based Hybrid Cuckoo Search With Hill Climbing (CS-HC) Algorithm Based Segmentation

- In computer vision applications, image segmentation is a common image processing step used to separate pixels into distinct groups. As the threshold count rises, the segmentation phase becomes harder, and selecting the thresholds grows into an NP-hard problem. This thesis suggests multilevel thresholding based on optimization techniques to extract the region of interest (ROI), and uses an enhanced lossless prediction algorithm to compress DICOM images in telemedicine applications. The hybrid cuckoo search with hill climbing (CS-HC) algorithm strengthens the process used by the search agent to update the optimal solution, and it computes the threshold values. As the simulation results show, the proposed CS-HC-based multilevel thresholding produces superior results: the optimization is efficient and converges quickly. The proposed lossless compression algorithm based on classification and blending estimation also provides effective results compared with JPEG lossless and lossy compression techniques. The algorithm's efficiency is checked with various threshold values. The algorithm is implemented in MATLAB 2010a and validated on DICOM images.


Introduction
In computer vision and image recognition, image segmentation is a central challenge. Image segmentation creates numerous sub-regions with homogeneous properties. It has a significant impact on bio-medical applications: an automatic segmentation technique can measure changes in anatomical configurations where modifications in biological tissue lead to disease. Minimum-cross-entropy-based global thresholding techniques were introduced for segmenting cuboidal cell nuclei in images of hematoxylin-and-eosin-stained prostate tissue. An atlas-aided fuzzy c-means (FCM-Atlas) method was introduced for the segmentation of fibroglandular tissue, where it estimates breast MRI volumetric density. Details from multimodality images are combined through a sparse representation, and brain tissue segmentation is integrated under structural constraints.
Image thresholding methods are the most commonly used in image segmentation. They fall into two classes: optimal thresholding and property-based thresholding. In the optimal thresholding strategy, an objective function is optimised to search for an optimum threshold. In property-based thresholding, various properties are calculated from the histogram in order to find the threshold.
The grey-level regions are separated based on parameters such as cross entropy, entropy and class variance, which are used in optimal thresholding methods to measure the threshold value. In Otsu's unsupervised method of selecting a threshold, for example, the between-class variance is maximised.
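As a concrete illustration of Otsu's criterion described above, the bi-level case can be sketched as an exhaustive search for the threshold that maximises the between-class variance of a grey-level histogram (a minimal sketch; function and variable names are illustrative, not from this work):

```python
import numpy as np

def otsu_threshold(hist):
    """Return the threshold t maximising the between-class variance
    for a grey-level histogram (bi-level Otsu)."""
    p = hist.astype(float) / hist.sum()        # grey-level probabilities
    levels = np.arange(len(p))
    best_t, best_var = 0, -1.0
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()      # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0  # class means
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

The multilevel extension searches over tuples of thresholds, which is exactly the combinatorial blow-up that motivates the metaheuristics discussed next.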
Optimal thresholding strategies are both efficient and straightforward for bi-level thresholding, but Otsu's unsupervised process becomes unreliable there, and for the multilevel thresholding problem it increases the computation time. To solve the problems of multilevel thresholding, algorithms such as ant colony optimization, particle swarm optimization (PSO), the bacterial foraging algorithm, Darwinian particle swarm optimization (DPSO) and fractional-order Darwinian particle swarm optimization (FODPSO) are used.
Due to their efficacy, PSO-based algorithms and their extensions, which include Darwinian particle swarm optimization (DPSO) and FODPSO, are the most widely used in image segmentation. The DPSO and FODPSO algorithms were proposed to solve the Otsu problem. The n-level thresholding issue reduces to an optimization problem that searches for the thresholds maximising the RGB (red-green-blue) objective (fitness) functions.
Premature convergence to sub-optimal solutions is a typical problem in population-based optimization algorithms, including the Cuckoo Search (CS) algorithm. The key reason is the inability of the CS optimization operators to retain the diversity of solutions across generations. This can be mitigated by integrating CS with other search strategies to form hybrid algorithms.
In contrast with the original CS algorithm, hybrid CS algorithms need more computational time. The hill climbing algorithm, however, reaches a good solution in a short time when combined with local search algorithms. Hence, a hybrid CS-HC algorithm is suggested in this article. The hybrid cuckoo search with hill climbing (CS-HC) algorithm strengthens the process used by the search agent to update the optimal solution, and it computes the threshold values.

Proposed Methodology
The dataset used by this suggested algorithm consists of CT lung scan images. The noise in these images is removed by pre-processing [1]. Multilevel thresholding segmentation methods are used to segment the pre-processed image. By producing several threshold levels, regions of interest and pixel values are extracted from the lung image, as shown in Figure 1.
The threshold value is optimised, and the best outcomes are achieved by using a hybrid swarm intelligence method: a combination of the cuckoo search and hill climbing algorithms [2]. To compress DICOM images, an improved lossless prediction algorithm is suggested; it is used in data transfer and storage applications [3].

Input Dataset
Real-time DICOM CT images of the abdomen are used to evaluate the algorithm. The images were obtained from the Metro Scans and Research Laboratories, Thiruvananthapuram, and were captured on an Optima CT machine with a slice thickness of 0.6 mm. In this work, 7 CT abdomen data sets are used for segmentation and 6 CT abdomen data sets are used for compression [4].

Pre-processing
Pre-processing is a crucial step in image handling. In pre-processing, noise is removed, contrast can be improved, and weak boundaries and unrelated parts are identified, since these defects impair parts of the medical image. Pre-processing techniques are used to address these problems. A Gaussian filter is used here in the time domain: it is a low-pass filter with small group delay [5], and it reduces distortion in both low and high signals.
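The Gaussian low-pass smoothing described above can be sketched with a separable convolution (a minimal NumPy sketch; the kernel radius rule and function names are illustrative assumptions, not the exact pre-processing pipeline of this work):

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel, truncated at 3*sigma and normalised."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def gaussian_smooth(img, sigma=1.0):
    """Separable Gaussian low-pass filter: rows first, then columns."""
    k = gaussian_kernel(sigma)
    pad = len(k) // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, padded)
    cols = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, rows)
    return cols[pad:-pad, pad:-pad]
```

Because the Gaussian is separable, filtering rows then columns is equivalent to a full 2-D convolution but much cheaper.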

Multilevel Thresholding
Thresholding is a similarity-based methodology and a fundamental segmentation algorithm whose behaviour is defined by the threshold value [6]. The first class corresponds to pixels with a grey value higher than the threshold value, while the second class corresponds to pixels with a grey value lower than the threshold value. Bi-level thresholding refers to the region of interest extracted from an image by a single threshold value. For an image with K grey levels g(x, y) and a threshold T, bi-level thresholding is expressed as in Equation (1) and Equation (2):

R1 = {g(x, y) | 0 ≤ g(x, y) < T} (1)
R2 = {g(x, y) | T ≤ g(x, y) ≤ K − 1} (2)

Multiple thresholds T1 < T2 < … < Tn are used in multilevel thresholding, and an image with multiple regions is created as in Equation (3), Equation (4), Equation (5) and Equation (6):

R1 = {g(x, y) | 0 ≤ g(x, y) < T1} (3)
R2 = {g(x, y) | T1 ≤ g(x, y) < T2} (4)
R3 = {g(x, y) | T2 ≤ g(x, y) < T3} (5)
Rn+1 = {g(x, y) | Tn ≤ g(x, y) ≤ K − 1} (6)

To compute the threshold values, parametric or non-parametric techniques are used. A parametric method estimates the probability density function for every class [7], so it is both time consuming and complicated. Non-parametric methods use variables such as error rate, entropy and class variance.
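Once thresholds are found, mapping grey levels to region labels as in Equations (3)-(6) is a one-liner (a minimal sketch; the function name is illustrative):

```python
import numpy as np

def apply_thresholds(img, thresholds):
    """Map each grey level to a region label using multilevel thresholds:
    pixels in [0, T1) -> 0, [T1, T2) -> 1, ..., [Tn, K-1] -> n."""
    return np.digitize(img, sorted(thresholds))
```

`np.digitize` returns, for each pixel, the index of the half-open interval it falls into, which is exactly the region index of the multilevel partition.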
The classical methods used in bi-level thresholding to estimate a threshold value are Otsu's and Kapur's techniques. Otsu's algorithm maximises the between-class variance, whereas Kapur's algorithm maximises the entropy. Multilevel thresholding can use these techniques, but with increased computational complexity; they are also noise-sensitive, and the number of thresholds needs to be defined by the user [8].
The results of segmentation techniques based on automatic and iterative thresholding are not up to the expected level, so the optimum threshold value is selected by using an optimization algorithm [9]. A swarm optimization technique, the cuckoo search, is used for multilevel thresholding. In cuckoo search, a cuckoo lays its egg in the nest of a host bird; the host bird can throw the egg away, abandon the nest, or construct a new nest in a new location. The basic steps of the cuckoo search algorithm are listed below. A new solution for the i-th cuckoo is produced by a Levy flight, as shown in Equation (7):

x_i^(t+1) = x_i^(t) + α ⊕ Levy(λ) (7)

where the step size is represented by α; its value depends on the problem. Entry-wise multiplication (⊕) is performed to compute the product. This procedure realises a Levy flight, whose step length is distributed according to the probability distribution shown in Equation (8):

Levy ~ u = t^(−λ), 1 < λ ≤ 3 (8)

This distribution has infinite variance. Successive steps or jumps by a cuckoo form a random-walk procedure, corresponding to a power-law step-length distribution with a heavy tail.
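Equation (8) only fixes the power-law tail; a common way to draw such steps in practice is Mantegna's algorithm, sketched below (an illustrative sketch, not necessarily the implementation used in this work; `beta` corresponds to λ − 1):

```python
import math
import numpy as np

def levy_step(beta=1.5, size=1, rng=None):
    """Draw Levy-distributed step lengths via Mantegna's algorithm."""
    rng = rng or np.random.default_rng()
    # scale of the numerator Gaussian, from Mantegna's formula
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)
```

A new candidate is then x_new = x + α * levy_step(...) for each dimension, matching Equation (7).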

Algorithm: Cuckoo Search
Define objective function f(x), x = (x1, x2, …, xd)
Initialize population of n host nests xi (i = 1, 2, …, n)
While (t < MaxGeneration) or (stop criterion)
    Get a cuckoo (say i) randomly and generate a new solution by Levy flights
    Evaluate its quality/fitness Fi
    Choose a nest among n (say j) randomly
    If (Fi > Fj)
        Replace j by the new solution
    End
    Abandon a fraction (Pa) of worse nests and build new ones at new locations via Levy flights
    Keep the best solutions or nests with quality solutions
    Rank the solutions and find the current best
End while
Post-process results and visualization
End
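The pseudocode above can be sketched as a compact maximiser (a minimal sketch under illustrative assumptions: parameter defaults, the box bounds, and the Levy move around the best nest are choices of this sketch, not of the original work):

```python
import math
import numpy as np

def cuckoo_search(f, dim, n_nests=15, pa=0.25, alpha=0.01,
                  bounds=(0.0, 1.0), max_gen=200, seed=0):
    """Maximise f over [lo, hi]^dim with a basic cuckoo search."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fitness = np.array([f(x) for x in nests])

    beta = 1.5  # Levy exponent (Mantegna's algorithm)
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)

    for _ in range(max_gen):
        best = nests[fitness.argmax()]
        for i in range(n_nests):
            # Levy flight move biased by distance to the current best
            u = rng.normal(0.0, sigma_u, dim)
            v = rng.normal(0.0, 1.0, dim)
            step = u / np.abs(v) ** (1 / beta)
            cand = np.clip(nests[i] + alpha * step * (nests[i] - best), lo, hi)
            j = rng.integers(n_nests)          # random nest to compare with
            if f(cand) > fitness[j]:
                nests[j], fitness[j] = cand, f(cand)
        # abandon a fraction pa of the worst nests
        n_drop = int(pa * n_nests)
        worst = fitness.argsort()[:n_drop]
        nests[worst] = rng.uniform(lo, hi, (n_drop, dim))
        fitness[worst] = [f(x) for x in nests[worst]]

    return nests[fitness.argmax()], fitness.max()
```

For multilevel thresholding, f would be the Otsu or Kapur objective evaluated at the candidate threshold vector.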
Hill Climbing (HC)
Hill Climbing (HC) is a mathematical optimization technique belonging to the local search family. It evaluates the current state in order to find the best solution in its neighbourhood. If it reaches the goal state, the process terminates; otherwise, the current state is continuously updated by applying new operators to it. This is a two-step procedure: in the first step, a new operator not yet applied to the current state is selected and applied; in the second step, the new states are evaluated.
These steps are expected to move towards a solution better than the current one; that is the fundamental idea of HC, and it is what improves the quality of the solution. A major advantage of HC is that it can be adapted to the problem: any aspect of the algorithm can be changed and customized. It is used in both discrete and continuous domains.
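The accept-only-improving loop described above can be sketched in a few lines (a minimal sketch; the Gaussian neighbour generator and step size are illustrative assumptions):

```python
import numpy as np

def hill_climb(f, x0, step=0.05, max_iter=500, seed=0):
    """Greedy hill climbing: keep a random neighbour only if it improves f."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        neighbour = x + rng.normal(0.0, step, x.shape)  # random nearby state
        fn = f(neighbour)
        if fn > fx:                                     # move uphill only
            x, fx = neighbour, fn
    return x, fx
```

Its strength is speed near a good starting point; its weakness, getting stuck on local optima, is exactly what the global CS phase compensates for.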

Cuckoo Search Algorithm with Hill Climbing Algorithm
The HC and CS techniques are combined to propose the CS-HC algorithm. The cuckoo search algorithm (CSA) is built on the obligate brood-parasitic behaviour of some cuckoo species, combined with Levy flights. A Levy flight is a random walk whose power-law step-length distribution has a heavy tail; the behaviour of some fruit flies and birds follows it. Levy flights give efficient results in global exploration.
The CSA is an efficient metaheuristic swarm-based algorithm that maintains a balance between wide global exploration and nearby local exploitation of the search space. There is, however, a chance of slow convergence towards poor results. To avoid this, the HC method is combined with the basic CSA for deep exploitation, enhancing its search ability; the result is termed the CS-HC algorithm. Benchmark functions are optimized using it.
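The hybrid idea can be sketched as a global cuckoo-style phase whose best nest is refined each generation by greedy hill climbing (a simplified illustrative sketch: the Cauchy-tailed global move stands in for a full Levy flight, and all names, bounds and step sizes are assumptions of this sketch, not the exact CS-HC of this work):

```python
import numpy as np

def cs_hc(f, dim, n_nests=10, pa=0.25, max_gen=50, hc_iters=20, seed=0):
    """Hybrid sketch: global cuckoo-style search + local hill climbing."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(0.0, 1.0, (n_nests, dim))
    fit = np.array([f(x) for x in nests])
    for _ in range(max_gen):
        # global phase: heavy-tailed random moves, accepted against a random nest
        for i in range(n_nests):
            cand = np.clip(nests[i] + rng.standard_cauchy(dim) * 0.01, 0.0, 1.0)
            j = rng.integers(n_nests)
            if f(cand) > fit[j]:
                nests[j], fit[j] = cand, f(cand)
        # abandon the worst fraction pa of nests
        worst = fit.argsort()[:int(pa * n_nests)]
        nests[worst] = rng.uniform(0.0, 1.0, (len(worst), dim))
        fit[worst] = [f(x) for x in nests[worst]]
        # local phase: hill climb from the current best nest
        b = fit.argmax()
        x, fx = nests[b].copy(), fit[b]
        for _ in range(hc_iters):
            nb = np.clip(x + rng.normal(0.0, 0.02, dim), 0.0, 1.0)
            fn = f(nb)
            if fn > fx:
                x, fx = nb, fn
        nests[b], fit[b] = x, fx
    b = fit.argmax()
    return nests[b], fit[b]
```

The local phase addresses the slow convergence of plain CS, while abandonment keeps injecting diversity for global exploration.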

Pseudo Code: Hybrid CS-HC
Start
Define objective function f(x), x = (x1, x2, …, xd)
Initialize a population of n host nests xi (i = 1, 2, …, n)
Define the cuckoo search parameter Pa
Begin CS
    Get a cuckoo (say i) randomly and generate a new solution by Levy flights
    Evaluate its quality/fitness Fi
    Choose a nest among n (say j) randomly
    If (Fi > Fj)
        Replace j by the new solution
    End
    Calculate the neighbouring nests
    Find the maximum of the neighbouring nests
    If (larger than the current one)
        Local minimum is found
    Else
        Again calculate the neighbouring nests
    If the local minimum is larger than the current local minimum
        Abandon a fraction (Pa) of worse nests and build new ones at new locations via Levy flights
    Keep the best solutions or nests with quality solutions
    Rank the solutions and find the current best based on the max iteration
    End while
    Final best population of nests
End begin CS

In CS-HC, the iterations are first searched using the standard cuckoo search algorithm. The search is then accelerated by passing the obtained best solution to HC, which rectifies the slow convergence of the standard cuckoo search algorithm. HC is an iterative algorithm: it starts from an arbitrary solution to the problem and attempts to find a better solution by incrementally changing a single element of the solution, continuing as long as the change produces better results. After convergence, the solution is returned to the CSA to be checked using the fraction probability.

Enhanced Lossless Prediction
A blending predictor was proposed in [11]. Based on this predictor, this research work proposes a lossless compression algorithm. An image with sharp horizontal edges can be predicted well by the classical predictor. In this work, the static predictors operate on an estimated set of neighbouring pixels; vector quantization has a similar initial stage. The following discussion presents the various stages of the lossless compression technique [12], [13].

Classification
The pixel to be predicted is represented by x′ and the neighbouring pixel set by Φ. The major aim of classification is the computation of the M pixels with the minimum distance vector from the pixel to be predicted in the causal context Φ. The distance to the pixel to be predicted is expressed as shown in Equation (9) and Equation (10).
The blending context Φb is represented by the set of M minimum-distance vectors. Cells are formed by grouping pixels based on the Euclidean distance, a process similar to a vector quantizer.
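The classification step described above amounts to selecting the M causal-context vectors closest, in Euclidean distance, to the vector of the pixel being predicted (a minimal sketch; names are illustrative and Equations (9)-(10) are assumed to define a Euclidean distance):

```python
import numpy as np

def classify_context(vectors, target, M):
    """Return indices of the M causal-context vectors closest (Euclidean)
    to the vector of the pixel being predicted."""
    vectors = np.asarray(vectors, dtype=float)
    d = np.linalg.norm(vectors - np.asarray(target, dtype=float), axis=1)
    return np.argsort(d)[:M]
```

The selected indices define the blending context Φb used in the next stage.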

Blending
The prediction is generated by a blending context Φb, which blends a set of static predictors f = {f1, f2, f3, …, fn}. A penalty factor pk is coupled with each predictor in the set f. Based on the pixel prediction by the context Φb, the penalty factor is defined as the mean square error over the context, as shown in Equation (11):

pk = Σ(i,j)∈Φb (x(i, j) − fk(i, j))² (11)

Prediction and Error Correction
The weight wk of a predictor in the set corresponds to the inverse of its penalty pk, as in Equation (12):

wk = (1 / pk) / Σm (1 / pm) (12)

The static prediction x̂ is computed as the weighted average of the individual predictions, and the pixel prediction is based on this.
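The inverse-penalty weighting can be sketched directly (a minimal sketch; the small `eps` guarding against a zero penalty is an assumption of this sketch):

```python
import numpy as np

def blend_prediction(predictions, penalties, eps=1e-9):
    """Blend static-predictor outputs with weights proportional to the
    inverse of each predictor's penalty (mean square error)."""
    p = np.asarray(penalties, dtype=float) + eps  # avoid division by zero
    w = (1.0 / p) / (1.0 / p).sum()               # normalised inverse-penalty weights
    return float(np.dot(w, predictions))
```

A predictor with a large penalty thus contributes almost nothing, which is how poor predictors are blended out of the context.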

Result and Discussion
DICOM medical images are compressed efficiently by the classification and blending prediction algorithm. MATLAB 2013a is used to implement this algorithm, running on a desktop computer with 4 GB RAM, a 64-bit operating system and an Intel Core i3 processor.

Fig. 2: Histogram of input images
Advances in Computing, Communication, Automation and Biomedical Technology
The classification and blending prediction method is lossless. In the initial stage of the CS-HC algorithm, the value '0' is assigned to the bird's nest. The search space boundaries are used to set the values of the neighbouring nests. The local minimum, local and neighbourhood values are assigned the least value, depending on the nature of the problem. The search is accelerated by passing the obtained best solution to HC, where a single element of the solution is varied incrementally to obtain a better solution.
Incremental changes are performed on the new solution whenever the change produces good results, and this process is repeated until the algorithm converges. The CSA solution is returned after convergence, and the returned results are checked using the threshold and the fraction probability.
The input image with its histograms is shown in Figure 2. For different threshold values, FODPSO outputs are shown in Figure 3.
More details are preserved in the resultant images with higher threshold values, as shown in Figure 5 and Figure 6. Population-based algorithms are used for optimization, and their experimental results are random as well as stochastic; because of this stochastic nature, the average value is computed by executing each technique 15 times. Table 1 shows the average and standard deviation of the objective values. The 7 data sets are represented by ID1 to ID7, and the threshold counts used here are 2, 3, 4 and 5. The experimental results proved that the proposed lossless compression algorithm is more efficient than the JPEG lossless and lossy compression algorithms. The JPEG lossy algorithm uses a Discrete Cosine Transform with a Huffman coder for compression, whereas the lossless algorithm uses a run-length encoder for compression together with an adaptive predictor and Golomb coding. The experimental results are shown in Table 2.
The current pixel is predicted using the predictions made by the static predictors. Each prediction is assigned a weight, and the weighted sum of the predictions gives the current pixel estimate; the weight is inversely proportional to the penalty factor. The efficiency of the blending context depends on the penalty factors of the predictors: the current prediction is generated precisely if the predictions made by the predictors are correct, while predictors with a high penalty factor are inefficient and are blended out of the current blending context. The average error ē is computed over the blending context Φb as

ē(Φb) = (1/M) Σ(i,j)∈Φb (x(i, j) − x̂(i, j)) (13)

and the final prediction of the current pixel is defined from the blending predictor error as

x̃(i, j) = x̂(i, j) + ē(Φb) (14)

The local property of the pixels is used to adjust both the classification and the blending process. The causal set of neighbouring pixels is used to predict the current pixel, and the classification process removes higher-order redundancy in the local image context.
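The bias correction of Equations (13) and (14) can be sketched as follows (a minimal sketch of the reconstructed equations; argument names are illustrative):

```python
import numpy as np

def corrected_prediction(x_hat, actual_ctx, pred_ctx):
    """Add the average prediction error over the blending context
    to the raw blended prediction (Eqs. (13)-(14) sketch)."""
    e_bar = np.mean(np.asarray(actual_ctx, dtype=float) -
                    np.asarray(pred_ctx, dtype=float))
    return x_hat + e_bar
```

Because the correction uses only causal pixels, the decoder can recompute ē(Φb) itself, which is what keeps the scheme lossless.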

Conclusion
In this work, medical images are thresholded at multiple levels using a hybrid cuckoo search with hill climbing algorithm. A lossless compression algorithm based on classification and blending prediction is also proposed for medical images. The experimental results are better than those of the JPEG lossless and lossy algorithms. The proposed algorithm can be used for ROI analysis of telemedicine images, and images can also be compressed effectively for data transfer. In future, the segmentation and compression algorithms can be implemented in hardware to make a portable system for analysis and transfer of data.