Bone metastasis detection method based on improving golden jackal optimization using whale optimization algorithm

This paper presents a machine learning-based technique for interpreting bone scintigraphy images, focusing on feature extraction and introducing a new feature selection method called GJOW. GJOW enhances the effectiveness of the golden jackal optimization (GJO) algorithm by integrating operators from the whale optimization algorithm (WOA). The technique’s performance is evaluated through extensive experiments using 18 benchmark datasets and 581 bone scan images obtained from a gamma camera, including 362 abnormal and 219 normal cases. The results highlight the superior predictive effectiveness of the GJOW algorithm in bone metastasis detection, achieving an accuracy of 71.79% and specificity of 91.14%. The contributions of this study include the introduction of a new machine learning-based approach for detecting bone metastasis using gamma camera scans, leading to improved accuracy in identifying bone metastases. The findings have practical implications for early detection and intervention, potentially improving patient outcomes.

The literature of ML for bone metastasis detection. Several AI techniques have been developed to handle bone metastasis detection. For example, Sharma et al. 6 carried out two trials with two machine learning models, one using a histogram of oriented gradients (HOG) feature set and another without HOG features; the study's dataset contains 105 images. Eid and Sauber 7 proposed an algorithm that integrates two methods to improve skeletal scintigraphy images, namely the Salp Swarm algorithm (SSA) and the neutrosophic set (NS), employing a dataset of 465 images. Liu et al. 8 developed a machine learning-based random forest (RF) model. The accuracy, specificity, recall rate, and area under the receiver operating characteristic curve (AUC) were used to assess and contrast the predictive effectiveness of the RF paradigm and the other paradigms using 17,138 patients.
Nakajima et al. 9 employed software for calculating the BSI; compared to the original database, a multi-institutional database of 1532 patients significantly improved bone metastasis detection, highlighting the importance of having a sufficiently large training database covering a variety of cancer types. Avula et al. 10 utilized the k-means clustering algorithm to segment the bone images. To detect bone cancer, the segmented image is further processed by calculating the mean intensity of the discovered area. Ranjitha et al. 11 developed a method for identifying bone malignant growths in ultrasound scans using a KNN classifier and k-means segmentation. Sinthia et al. 12 developed a technique for recognizing bone tumors using MRI. The average filter and the bilateral filter are two pre-processing techniques included in the suggested method to eliminate noise and smooth images. By applying the k-means approach to calculate the mean intensity and malignancy size, the presence of bone cancer is checked on the MRI images to identify its stage. Satheesh et al. 13 advocated using a clinical dataset to detect bone cancer. Here, the suggested design consists of two steps for more accurate disorder prediction. The Gray-Level Co-occurrence Matrix (GLCM) method is used in the first step to extract statistical texture features from a segmented bone image. The second stage classifies the extracted features using KNN and a decision tree algorithm. Zhang 14 proposed a novel PSO-based unsupervised feature selection method, called the filter-based bare-bone particle swarm optimization algorithm (FBPSO). To hasten the algorithm's convergence, two filter-based techniques are suggested: a local filter search method based on feature redundancy is utilized to increase the swarm's capacity for exploitation, while a space reduction strategy based on average mutual information is employed to quickly remove irrelevant and weakly relevant features. Bhukya et al. 15 proposed a system for automatically detecting bone cancer to help physicians find the disease early and get patients the treatment they need: a fuzzy C-means (FCM) and M3-filtered segmentation method on the basis of support vector machines (SVM) for the detection of bone malignancies.
In addition, Shukla et al. 16 used image segmentation methods such as Prewitt, Sobel, k-means, Canny, and region growing, which are suitable for the interpretation of X-ray and MRI images. The results of edge-based and region-based image segmentation methods applied to X-ray images to identify osteosarcoma on bone are also presented.

The literature of application of MH techniques.
Recently, metaheuristic (MH) techniques have received more attention since they have been applied to many different applications. For example, H. Mohammadzadeh et al. 17 proposed a new approach based on the concept of agents and multi-agent systems (MASs), referred to as the MAMH method.
In the proposed method, several primary and robust metaheuristic algorithms are viewed as detached agents, each attempting to achieve its own goals while competing and cooperating with others to achieve common goals. The suggested method was examined on 32 complicated benchmark functions, and the results demonstrated its usefulness and power in tackling high-dimensional optimization issues.
Gharehchopogh et al. 18 examined the sparrow search algorithm (SSA), one of the modern and reliable algorithms for resolving optimization issues, and covered all of the SSA literature on improvement, variants, optimization, and hybridization. Piri, J. et al. 19 first used a technique called artificial gorilla troop optimization (DAGTO) to handle FS jobs in the healthcare industry. Depending on the number and kind of objective functions, they implemented four variations of the suggested method: (1) single-objective (SO-DAGTO), (2) bi-objective (wrapper) (MO-DAGTO1), (3) bi-objective (filter-wrapper hybrid) (MO-DAGTO2), and (4) tri-objective (filter-wrapper hybrid) (MO-DAGTO3), for identifying pertinent features in diagnosing a specific disease. To increase population diversity and hasten convergence, they offer a superior gorilla initialization technique based on label mutual information (MI). Ten medical datasets are considered to verify the performance of the offered approaches.
Shishavan S.T. and F.S. Gharehchopogh 20 enhanced the Cuckoo Search Optimization (CSO) technique with a genetic algorithm (GA) for community detection in complex networks. GA has been highly successful in increasing exploration and exploitation when detecting communities in complex networks. Premature convergence, delayed convergence, and becoming stuck in local traps are all issues with the CSO algorithm, so GA operators were utilized dynamically to improve the speed and accuracy of the CSO. The proposed algorithm was tested against Artificial Bee Colony (ABC), GA, CSO, and the grey wolf optimizer (GWO), with various iterations under the modularity and NMI criteria. The results reveal that in most comparisons the suggested algorithm outperformed the baseline comparative algorithms. Gharehchopogh, F.S. 21 presented a review of different usages of quantum computing (QC) in metaheuristics. This review also includes a taxonomy of quantum-inspired metaheuristic algorithms in optimization issues, as well as discussions of their applications in engineering and science; the main objective is to provide an overview of and review the applications of quantum-inspired metaheuristic algorithms.
Shen Y. et al. 22 presented a novel population evolution technique to assist MEWOA in improving its global optimization abilities and avoiding local optima. MEWOA is compared to five modern WOA variations and seven fundamental metaheuristic algorithms over 30 benchmark functions with dimensions of 100, 500, 1000, and 2000. On the majority of benchmark functions, MEWOA exhibits shorter runtime, faster convergence, and higher solution accuracy than the other methods. Ayar, M. et al. 23 introduced a novel chaotic-based divide-and-conquer (CDC) algorithm to pick the best features out of a feature collection (the UCI Arrhythmia Dataset). In terms of accuracy, sensitivity, specificity, and F-measure, the proposed method produced performance rates of 88.21%, 89.41%, 87.64%, and 86.54%, respectively.
Gharehchopogh F.S. et al. 24 examined a new metaheuristic algorithm known as the Slime Mould Algorithm (SMA) from many optimization perspectives. The fluctuating behavior of slime mold in nature led to the creation of the SMA. In the surveyed application areas, SMA is used at rates of 15, 36, 7, and 42%, respectively. The results support the assertion that SMA has been successfully applied to numerous optimization issues; as a result, it is hoped that academic scientists, professionals, and engineers will find that study useful. Mohammed, H. and T. Rashid 25 presented the Fox Optimizer (FOX), a nature-inspired optimization algorithm that imitates the foraging habits of foxes pursuing prey in the wild. The performance of the model is assessed using five traditional benchmark functions and benchmark test functions from CEC2019, and the results demonstrate that FOX performs statistically much better than the comparison algorithms.
Abdullah J.M. and T. Ahmed 26 proposed the fitness-dependent optimizer (FDO), a novel swarm intelligence algorithm. FDO computes velocity in a unique way: it leverages the problem fitness function value to generate weights, which guide the search agents throughout both the exploration and exploitation phases. FDO is evaluated on a set of 19 traditional benchmark test functions, and the results are compared to three well-known algorithms: the genetic algorithm (GA), the dragonfly algorithm (DA), and PSO.
The FDO results demonstrate superior performance in the majority of cases. Mohammadi, M. et al. 27 suggested the donkey and smuggler optimization algorithm (DSO), inspired by how donkeys search for things. The experimental results were divided into two sections: first, benchmark test functions were employed to compare the algorithm's performance to that of the most well-known and cutting-edge algorithms; second, three real-world applications, the traveling salesman problem, packet routing, and ambulance routing, were used to adapt and implement the DSO. DSO's experimental results on these real-world problems are quite promising.
Mohammed, H.M., S.U. Umar, and T.A. Rashid 28 gave a systematic and meta-analysis study of WOA to assist researchers in applying it in various domains or hybridizing it with other prevalent algorithms. Statistical outcomes of WOA modifications and hybridizations are established and compared to the most prevalent optimization techniques and to WOA itself. According to the survey results, WOA outperforms other common algorithms in terms of convergence speed and balancing exploration and exploitation.
Abdulhameed S. and T.A. Rashid 29 introduced a novel metaheuristic, the child drawing development optimization (CDDO) algorithm, inspired by children's learning behavior and cognitive development, which uses the golden ratio to optimize the beauty of their artwork. CDDO finds the global optimum on the 19 benchmark functions used in the study. Its output is compared to that of multiple cutting-edge algorithms, including DE, PSO, GSA, FEP, and WOA. The test results reveal that CDDO is highly competitive, with a score of 2.8, demonstrating its exceptional strength in finding fresh solutions.
Rahman C.M. and T.A. Rashid 30 suggested the learner performance-based behavior algorithm (LPB), a novel evolutionary algorithm. To demonstrate its accuracy, the algorithm is tested against a variety of test functions, including standard benchmark functions, CEC-C06 2019 test functions, and a real-world case study. The results are then compared to the DA, GA, and PSO. The proposed method yielded superior outcomes in the majority of situations and comparable results in others.

Methods
The whale optimization algorithm. The whale optimization algorithm (WOA) is a population-based metaheuristic algorithm inspired by the hunting strategy of humpback whales. When humpback whales locate their prey, they dive about twelve meters below the surface before ascending in a spiral of bubbles. Encircling prey, the spiral bubble-net feeding maneuver, and searching for prey are the three primary phases of the position update 32.
Humpback whales use two methods during exploitation to alter their positions in the direction of the global optimum: shrinking encircling (encircling prey) and spiral position updating (the spiral bubble-net feeding maneuver); they treat the best solution found so far as the target prey. The mathematical model of the shrinking encircling process is depicted as 32:

D = |C • X*(t) − X(t)|,  (1)

X(t + 1) = X*(t) − A • D,  (2)

in which X is the location vector and X* represents the best solution found thus far, updated in each iteration if a better solution emerges. The variable t stands for the present iteration, |⋅| for the absolute value operation, and • refers to element-by-element multiplication. The two parameters A and C are determined as follows 32:

A = 2a • r − a,  (3)

C = 2r,  (4)
where r is a random value in [0, 1] and a drops linearly from 2 to 0 over the course of the iterations (during both the exploration and exploitation phases) to achieve the shrinking encircling behavior.
A spiral equation is utilized to mathematically express the spiral updating position, as 32:

D′ = |X*(t) − X(t)|,  (5)

X(t + 1) = D′ • e^(bl) • cos(2πl) + X*(t),  (6)
In this formula, D′ denotes the distance between the i-th whale and the optimum solution discovered thus far, l is a random value in the range [−1, 1], and b is a constant used to define the shape of the logarithmic spiral. It is important to note that when whales grab their prey, they simultaneously use a spiral-shaped track and a shrinking encirclement. Each method has a 50% chance of being used to mimic this behavior 32:

X(t + 1) = X*(t) − A • D  if p < 0.5,  or  X(t + 1) = D′ • e^(bl) • cos(2πl) + X*(t)  if p ≥ 0.5,  (7)
where p is a random value between 0 and 1. For exploration, a global search is created to improve the capacity for discovery. Its mathematical model is comparable to Eqs. (1) and (2), with the exception that the search is guided by a randomly selected search agent instead of the best agent. Whether to update the position through exploration (search for prey) or exploitation (the shrinking encircling mechanism) is decided using the variable A: exploration is used when |A| > 1 and exploitation when |A| < 1, as 32:

D = |C • X_rand − X|,  (8)

X(t + 1) = X_rand − A • D,  (9)
where X_rand is a random location vector picked from the current population.
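The WOA update rules above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the authors' code; variable names follow the equations, and the caller is assumed to shrink `a` from 2 to 0 across iterations.

```python
import math
import random

def woa_update(population, best, a, b=1.0):
    """One WOA iteration (illustrative sketch of Eqs. (1)-(9)).

    population : list of position vectors
    best       : best position X* found so far
    a          : control scalar, decreased linearly from 2 to 0 by the caller
    b          : logarithmic-spiral shape constant
    """
    dim = len(best)
    updated = []
    for x in population:
        A = 2 * a * random.random() - a      # Eq. (3)
        C = 2 * random.random()              # Eq. (4)
        if random.random() < 0.5:            # Eq. (7): encircling branch
            # |A| < 1 -> exploit around the best; |A| >= 1 -> explore a random whale
            ref = best if abs(A) < 1 else random.choice(population)
            new_x = [ref[j] - A * abs(C * ref[j] - x[j]) for j in range(dim)]
        else:                                # Eq. (7): spiral branch, Eqs. (5)-(6)
            l = random.uniform(-1, 1)
            new_x = [abs(best[j] - x[j]) * math.exp(b * l) * math.cos(2 * math.pi * l)
                     + best[j] for j in range(dim)]
        updated.append(new_x)
    return updated
```

The 50/50 split between the encircling and spiral branches mirrors the p-test of Eq. (7); within the encircling branch, the magnitude of A switches between exploitation and exploration.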

Golden jackal optimization. Golden jackal optimization (GJO) is a powerful metaheuristic algorithm that replicates the natural hunting behavior of golden jackals, in which males and females frequently hunt together. The three stages of the golden jackal's hunting habit are searching for the prey and moving toward it, surrounding and agitating the prey until it stops, and finally pouncing on the prey 33.
During the startup step, a collection of prey location matrices is randomly distributed, generated by the following equation 33:

X_(i,j) = lb_j + rand × (ub_j − lb_j),  (10)

where X_(i,j) denotes the j-th dimension of the i-th prey, lb_j and ub_j are the lower and upper bounds of that dimension, rand is a uniform random number in [0, 1], N stands for the number of prey in the population, and n for the number of dimensions with d variables. The golden jackal's hunt (exploration, |E| > 1) can be mathematically described as 33:

X_1(t) = X_M(t) − E • |X_M(t) − rl • prey(t)|,  (11)

X_2(t) = X_FM(t) − E • |X_FM(t) − rl • prey(t)|,  (12)

in which t denotes the current iteration, prey(t) represents the prey's position vector, X_M(t) denotes the male golden jackal's location, and X_FM(t) denotes the female's position. The updated positions of the male and female golden jackals are X_1(t) and X_2(t), respectively. The prey's evading energy E is computed from 33:

E = E_1 × E_0,  (13)
where E_0 is a random value between −1 and 1 representing the prey's initial energy, and E_1 indicates the decreasing energy of the prey, as 33:

E_1 = c_1 × (1 − t/T),  (14)
The maximum number of iterations is T, and c_1 is a default constant set to 1.5, so E_1 represents the prey's energy decreasing from 1.5 to 0 over the iterations. The term X_M(t) − rl • prey(t) in Eqs. (11) and (12) indicates the distance between the golden jackal and the prey; this distance is subtracted from or added to the current position of the jackal according to the evading energy of the prey. The vector rl contains random numbers drawn from the Lévy flight function, and the multiplication of rl and prey(t) mimics the Lévy-style movement of the prey; it is calculated as 33:

rl = 0.05 × LF(y),  (15)
where the Lévy flight function LF(x) is calculated from 33:

LF(x) = 0.01 × (μ × σ) / |ν|^(1/β),  with  σ = [Γ(1 + β) × sin(πβ/2) / (Γ((1 + β)/2) × β × 2^((β−1)/2))]^(1/β),  (16)

in which μ and ν are random values inside (0, 1) and β is a default constant with the value 1.5. Consequently, the jackal positions are updated by taking the mean of Eqs. (11) and (12), as 33:

X(t + 1) = (X_1(t) + X_2(t)) / 2.  (17)
Here X(t + 1) is the updated position of the prey obtained from the male and female golden jackals; the prey's evasive energy decreases as a result of the jackals' harassment. The golden jackals' mathematical model of surrounding and consuming their prey (exploitation, |E| ≤ 1) is 33:

X_1(t) = X_M(t) − E • |rl • X_M(t) − prey(t)|,  (18)

X_2(t) = X_FM(t) − E • |rl • X_FM(t) − prey(t)|,  (19)

with the final position again given by the mean in Eq. (17).
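Under the same notation, the GJO update (including the Lévy-flight term of Eqs. (15)-(16)) can be sketched as follows. This is an illustrative Python reimplementation, not the authors' code; `male` and `female` stand for the best and second-best solutions.

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-flight value LF(x), Eq. (16); mu and nu drawn from (0, 1)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu, nu = random.random(), random.random() + 1e-12   # avoid division by zero
    return 0.01 * (mu * sigma) / abs(nu) ** (1 / beta)

def gjo_update(population, male, female, t, T, c1=1.5):
    """One GJO iteration (illustrative sketch of Eqs. (11)-(19))."""
    E1 = c1 * (1 - t / T)                                # Eq. (14)
    updated = []
    for prey in population:
        E = E1 * random.uniform(-1, 1)                   # Eq. (13), E0 in (-1, 1)
        dim = len(prey)
        rl = [0.05 * levy_step() for _ in range(dim)]    # Eq. (15)
        if abs(E) > 1:                                   # exploration, Eqs. (11)-(12)
            x1 = [male[j] - E * abs(male[j] - rl[j] * prey[j]) for j in range(dim)]
            x2 = [female[j] - E * abs(female[j] - rl[j] * prey[j]) for j in range(dim)]
        else:                                            # exploitation, Eqs. (18)-(19)
            x1 = [male[j] - E * abs(rl[j] * male[j] - prey[j]) for j in range(dim)]
            x2 = [female[j] - E * abs(rl[j] * female[j] - prey[j]) for j in range(dim)]
        updated.append([(u + v) / 2 for u, v in zip(x1, x2)])  # Eq. (17)
    return updated
```

Note how the energy term E both scales the step and switches between the exploration and exploitation forms of the update.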
Proposed method. Four stages make up the suggested method: pre-processing, feature extraction, feature selection, and classification. First, the input bone gamma-camera scan image is segmented. The features retrieved from the segmented image are then utilized to predict the classification of the bone state using machine learning. The next subsections describe the feature extraction and feature selection phases in depth. Figure 1 shows the phases of the proposed method.
Feature extraction. We apply a variety of algorithms for feature extraction from a medical image. As shown in Fig. 2, they can be roughly categorized into three groups: local features, global features, and feature descriptors 34.
Global features. Global features consider the entire image composition. Because they can describe a wide range of traits, they are further classified into the subcategories listed below.
- Moment features: In this denomination, features are obtained from various image moments using the Hu moments and Zernike algorithms, as well as affine and blur moments.
- Point features: The algorithms utilized are the Laplacian of Gaussian (LoG), Gilles key points, and Law features; these features are based on various kinds of key points and their characteristics, such as the quantity or distribution of detected points.
- Line features: This category's features are calculated using lines in the image and information taken from them, using characteristics of linear structures, edges of detected objects, corners, and circles, as well as measurements generated from line profiles.
- Region features: This set is derived from certain image regions. Regions can be detected objects or patterns in an image, or predetermined portions, for example smaller sub-images. The region feature algorithms included are based on the connectivity of detected objects, saliency, lacunarity, edge-based region (EBR)/intensity extrema-based region (IBR), maximally stable extremal region (MSER), local binary pattern (LBP), and sub-images based on quadtree decomposition or sector division.
Feature descriptors. The third denomination of features comprises region covariance descriptors, the locally oriented statistics information booster (LOSIB), and speeded-up robust features (SURF), which can express information on both global and/or local image attributes.
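As an example of one of the listed region-feature extractors, a minimal 8-neighbour local binary pattern (LBP) histogram can be computed with plain NumPy. This is a simplified sketch; production code would typically use a library implementation such as scikit-image's `local_binary_pattern`.

```python
import numpy as np

def lbp_histogram(img):
    """Minimal 8-neighbour local binary pattern histogram (illustrative)."""
    img = np.asarray(img, dtype=float)
    center = img[1:-1, 1:-1]
    # 8 neighbours, ordered clockwise starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(center.shape, dtype=int)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        # set the corresponding bit where the neighbour is at least the center
        codes += (neighbour >= center).astype(int) << bit
    hist = np.bincount(codes.ravel(), minlength=256)
    return hist / hist.sum()
```

The resulting 256-bin normalized histogram is the texture feature vector; flat regions concentrate mass in code 255 (all neighbours equal to the center).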
Suggested feature selection algorithm. Feature extraction is followed by feature selection (FS). This is a critical step, since a well-chosen set of discriminating features enables the creation of a high-accuracy classifier. The suggested feature selection algorithm integrates two algorithms: golden jackal optimization (GJO) and the whale optimization algorithm (WOA).
The suggested FS algorithm is named GJOW, which combines the GJO and WOA algorithms. The fundamental structure of the GJO algorithm is improved by enhancing the population's position update phase. This modification integrates the WOA's update mechanism into the main structure of the GJO, giving the GJO more flexibility in exploring the population and ensuring its diversity, as well as in quickly reaching the optimum values. The suggested GJOW's initial step is to determine the parameters and create the population, which represents a group of candidate solutions to the problem at hand (feature selection).
The efficiency of each solution is then assessed by calculating its fitness function and selecting the best one. The subsequent phase of the GJOW algorithm updates the current population using either the GJO or the WOA, depending on the fitness function: the WOA is employed if the fitness function probability for the current solution is larger than 0.15; otherwise, the GJO is utilized. After computing the fitness function for each updated solution, the best one is selected. If the stop conditions are met, the best solution is returned; if not, the previous steps are repeated, from calculating the probability to the end. These steps are covered in more detail in the paragraphs that follow.
The GJOW algorithm begins by specifying the initial parameters for the GJO and the WOA. Next, the GJO produces a random population X of size N in dimension D and measures the fitness of every solution x_i, i = 1, 2, …, N. Before the objective function is evaluated, every solution x_i is transformed into a binary vector (consisting exclusively of 1's and 0's) based on the value of a random threshold ε ∈ [0, 1], using the following rule: the j-th element of the binary vector is 1 if x_(i,j) > ε, and 0 otherwise. The chosen features are represented by the elements of x_i that equal 1; the other elements are discarded because they represent irrelevant features. The objective function for each x_i is then computed as:

Fit_i = ε × Ex_i(t) + (1 − ε) × (d_i / D),

where Ex_i(t) is the classification error of the applied classifier (KNN in this work) and d_i is the number of selected features, so the second term penalizes large feature subsets. The parameter ε ∈ [0, 1] is employed to strike a compromise between the number of picked features and the classification error. The probability P_ROI of each fitness value is then computed, and the GJO or WOA is used to update the existing solution x_i in accordance with the P_ROI value: the WOA algorithm, described in Sect. "The whale optimization algorithm", is used if P_ROI > 0.15; otherwise, the GJO algorithm, described in Sect. "Golden jackal optimization", is used.
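The threshold rule and the two-term objective can be written directly. This is a sketch: the weight name `eps_w` and its value are stand-ins for the paper's balance parameter, and the KNN error is assumed to be computed elsewhere.

```python
def binarize(x, eps):
    """Threshold a continuous solution into a 0/1 feature mask."""
    return [1 if v > eps else 0 for v in x]

def objective(mask, clf_error, eps_w=0.99):
    """Weighted sum of classification error and selected-feature ratio.

    clf_error : error of the classifier (KNN in the paper) on the
                features selected by `mask` (computed elsewhere)
    eps_w     : trade-off weight in [0, 1] (illustrative value)
    """
    ratio = sum(mask) / len(mask)          # d_i / D, fraction of kept features
    return eps_w * clf_error + (1 - eps_w) * ratio
```

For example, `binarize([0.7, 0.2, 0.9, 0.6], 0.5)` yields `[1, 0, 1, 1]`, keeping three of the four features.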
For each updated solution, the fitness function is calculated, and the best update is kept. This sequence is repeated until the stopping requirement is satisfied (the suggested GJOW algorithm uses the maximum number of iterations as its stopping condition). Figure 3 shows the main structure of the proposed algorithm.
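The overall loop can be sketched as follows. This is an illustrative skeleton, not the authors' code: the exact P_ROI formula is not reproduced here, so a simple normalization of fitness values is assumed, and the two update rules are injected as callables (with trivial placeholders) to keep the sketch self-contained.

```python
import random

def gjow_search(objective, dim, n=15, iters=100, p_thresh=0.15,
                gjo_step=None, woa_step=None):
    """Skeleton of the hybrid GJOW loop (illustrative sketch).

    objective        : maps a solution vector to a cost (lower is better)
    gjo_step/woa_step: per-solution update rules; simple placeholders are
                       used by default so the sketch runs on its own
    """
    gjo_step = gjo_step or (lambda x, best: [v + random.uniform(-0.1, 0.1) for v in x])
    woa_step = woa_step or (lambda x, best: [b + 0.5 * (v - b) for v, b in zip(x, best)])
    pop = [[random.random() for _ in range(dim)] for _ in range(n)]
    fits = [objective(x) for x in pop]
    for _ in range(iters):
        total = sum(fits) or 1.0
        for i, x in enumerate(pop):
            p = fits[i] / total               # assumed P_ROI: normalized fitness
            step = woa_step if p > p_thresh else gjo_step
            candidate = step(x, pop[fits.index(min(fits))])
            f = objective(candidate)
            if f < fits[i]:                   # keep only improving updates
                pop[i], fits[i] = candidate, f
    best_i = fits.index(min(fits))
    return pop[best_i], fits[best_i]
```

In the actual method the candidate would be binarized and scored with the KNN-based objective before acceptance; here any cost function works, e.g. a sphere function for a quick smoke test.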
O(T × (K_1 + K_2) × D),  (20)

where K_1 and K_2 refer to the numbers of solutions updated using WOA and GJO, respectively, D stands for the number of features, and T for the number of iterations.

Results
Data description. To assess the efficacy of the suggested algorithm, we chose 581 patients from a database (362 abnormal and 219 normal), women and men aged between 3 and 90 years who had undergone skeletal scintigraphy imaging due to probable bone metastatic illness. The outcome is an image with two views: an anterior and a posterior.
Each image is broken into two sections of varying brightness; for example, the first image has two parts that range from 80 to 64%, and the second from 39 to 19%. Figure 4 shows an illustration of skeletal scintigraphy images. In addition, we used 23 UCI machine-learning datasets as benchmarks to evaluate the performance of the developed FS model.

Parameter setting and performance metrics.
The results of the GJOW are compared to seven methods: GJO, BPSO, WOA, EO, HHO, SSA, and ASO. The parameters of each algorithm are assigned according to the original implementation, as listed in Table 1. In addition, the common parameters, such as the number of solutions and the number of iterations, are set to 15 and 100, respectively.
In addition, seven performance measures are applied: the standard deviation, average (Avg), maximum (Max), and minimum (Min) of the objective function value, together with the accuracy (ACC), sensitivity (SEN), and specificity (SPE).
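The three classification measures reduce to confusion-matrix counts, with abnormal scans taken as the positive class:

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity and specificity from confusion-matrix counts."""
    acc = (tp + tn) / (tp + tn + fp + fn)
    sen = tp / (tp + fn)   # true-positive rate on abnormal cases
    spe = tn / (tn + fp)   # true-negative rate on normal cases
    return acc, sen, spe
```

With illustrative counts, `classification_metrics(50, 30, 10, 10)` gives an accuracy of 0.80, sensitivity of about 0.833, and specificity of 0.75.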

Results and discussion of benchmark datasets.
The comparative results of the developed method are given in Tables 2, 3, 4, 5, 6, 7, 8 and 9. Table 2 presents the average of the objective function for the GJOW and the other methods over the twenty-three datasets. In Table 2, the GJOW obtained the top outcomes in 6 of 23 datasets (i.e., Brain_T91, IonosphereEW5, KrvskpEW4, leuk1, and PenglungEW3), whereas the GJOW, GJO, and BPSO tied for the best results in 5 datasets; these results ranked the GJOW as the best method amongst the compared algorithms. The BPSO obtained second place by getting the best average in four datasets (i.e., Breastcancer4, Lymphography3, Tic-tac-toe3, and base_Vote3). The GJO was ranked third by getting the best average in four datasets (i.e., BreastEW12, base_HeartEW2, SpectEW2, and WaveformEW9). The EO and WOA showed promising outcomes and came in fourth and fifth, respectively, whereas the remaining methods were sorted as HHO, SSA, and ASO, respectively. Figure 5 illustrates the average of the objective function for the datasets. In this figure, the algorithm names are listed on the x-axis, while the average values of the objective function are listed on the y-axis; a shorter bar (i.e., GJOW) indicates a better algorithm, whereas a longer bar indicates a worse one.
Table 3 shows the standard deviation results for the methods. The BPSO was ranked first, showing the smallest values in most datasets, and the suggested GJOW method obtained acceptable standard deviation results in most cases. The GJO came in third place, after the EO, HHO, SSA, and WOA, while the ASO approach revealed the poorest standard deviation values.
Moreover, Table 4 displays the objective function's worst (maximum) values. In this table, the BPSO and the suggested GJOW showed the best results in most datasets and were ranked first and second, respectively. The GJO showed the third-best values, followed by EO, HHO, WOA, and SSA. The worst values were shown by ASO.
Table 5 records the best outcome of the objective function for all methods. In this regard, the suggested GJOW showed the minimum values in comparison to the other algorithms, achieving the best values in most datasets.
Parameter analysis of GJOW. In this section, we investigate the performance of GJOW using different values for its parameters c1 and b. The results of this experiment are presented in Tables 10 and 11. We consider values of c1 of 1.2, 1.5, and 1.7, and values of b of 0.5, 1, and 2.
Table 10 shows that the accuracy of GJOW with c1 values of 1.5 and 1.7 is nearly the same and achieves the best performance across all datasets; the accuracy remains consistent for all c1 values across approximately 10 datasets. In terms of sensitivity and specificity, GJOW performs best with c1 values of 1.7 and 1.2, respectively. The average performance across the tested datasets is depicted in Fig. 7, which indicates that increasing the value of c1 improves the overall performance, although the difference between c1 = 1.5 and c1 = 1.7 is negligible.
Furthermore, analyzing the results of GJOW with different values of parameter b (0.5, 1, and 2) in Table 11, we observe that the best overall performance is achieved with a b value of 2, suggesting that increasing b enhances the prediction ability of GJOW while keeping c1 fixed at 1.5. Overall, however, there is no significant difference in the results obtained by varying c1 or b, as the differences between them are minimal.

Discussion of bone metastasis datasets.
For further analysis, the suggested GJOW was evaluated using the real medical data described in Sect. "Data description". The results of different measures of the objective function are listed in Table 12. This table demonstrates that the suggested GJOW achieved promising outcomes in the average, min, max, and standard deviation measures; it came first, with a small difference from the HHO and WOA methods, while the ASO displayed the worst results among all methods. Additionally, the suggested GJOW obtained the best accuracy and specificity measures.
From these results and the discussion of the developed method, we can observe its high ability to detect bone metastasis, as well as its high ability to classify the UCI datasets. However, the time complexity of the developed method is considered one of its main limitations. In addition, the initial population has the largest effect on the convergence rate of the GJOW towards the optimal features.

Figure 6. Average of accuracy measure for the datasets.

Conclusion and future work
Early identification of bone cancer is crucial due to its hazardous nature.However, accurately detecting the disease poses a significant challenge.This paper addresses this challenge by investigating the effectiveness of an adaptive algorithm called GJOW in improving the accuracy of bone cancer detection in bone scans through feature extraction and selection.The proposed method aims to determine the presence or absence of a tumor in bone scans that have been classified as normal or abnormal.The study contributes to the field by introducing a machine learning-based approach for bone metastasis detection using gamma camera scans, enhancing the Golden Jackal Optimization (GJO) algorithm by incorporating operators from the Whale Optimization Algorithm (WOA), and evaluating a developed feature selection method using real-world bone metastasis datasets.The experimental results highlight the success of the GJOW algorithm, which achieves high classification accuracy.Notably, the proposed method outperforms others across all datasets, with an average accuracy of 97% in the first experiment and the best accuracy of 73% in the second experiment.Future work will involve leveraging a larger dataset to further evaluate the model's performance and exploring additional feature selection methods.

- Intensity features: These features depend on the variety of input-image intensities; the algorithms utilized include various intensity measurements, image gradients, histograms, and image singular value decomposition.
- Geometrical features: Features in this category describe the structural qualities of an image. For example, fractal dimension characteristics quantify how self-similar structures inside a picture are; other geometrical features are based on run length, form factor, or gray-level co-occurrence matrices (GLCM).
- Transformation features: An image can undergo a variety of transformations, and features are then calculated from the transformed image. The transformation features encompass the discrete cosine, Fourier, distance, Hankel, top-hat, Hough, and various unitary transforms, skeletonization, and Gabor filters.
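The intensity-feature family above can be illustrated with a small NumPy sketch; the specific feature choices, bin count, and ordering are illustrative rather than the paper's exact pipeline.

```python
import numpy as np

def intensity_features(img):
    """Global intensity features (sketch): basic statistics, a gradient
    magnitude summary, a normalized histogram, and leading singular values."""
    img = np.asarray(img, dtype=float)
    feats = [img.mean(), img.std(), img.min(), img.max()]
    gy, gx = np.gradient(img)
    feats.append(np.hypot(gx, gy).mean())           # mean gradient magnitude
    hist, _ = np.histogram(img, bins=8, range=(img.min(), img.max() + 1e-9))
    feats.extend(hist / hist.sum())                 # 8-bin normalized histogram
    s = np.linalg.svd(img, compute_uv=False)
    feats.extend(s[:3])                             # top-3 singular values
    return np.array(feats)
```

Each scan region then contributes a fixed-length vector (here 16 values) that can be concatenated with texture and shape features before feature selection.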

Figure 1. Phases of the proposed method.

Figure 2. The type of features used in this study.

https://doi.org/10.1038/s41598-023-41733-x

The time complexity of the developed method depends on several factors, including the numbers of iterations, solutions, and features. We can therefore formulate the time complexity of the developed method as given in Eq. (20).

Figure 3. The main structure of the proposed algorithm.

Figure 7. Average performance at different values of parameter c1.
Local features. In contrast to global features, local features reflect the structures of smaller image regions. Depending on the kind of local structure they are extracted from, local features can be divided into three subclasses.

Table 1. The parameter value of each algorithm.

Table 2. Average of fitness value. Best values are in bold.

Table 3. STD results for fitness value. Best values are in bold.

Table 4. Max results of fitness value. Best values are in bold.

Table 5. Min results of fitness value. Best values are in bold.

Table 6. Accuracy results for each FS method. Best values are in bold.

Table 7. Sensitivity results of each FS method. Best values are in bold.

Table 8. Specificity results of each FS method. Best values are in bold.

Table 9. Statistical results of each FS method. Best values are in bold.

Table 10. The results of different values of the parameter c1. Best values are in bold.

Table 11. The results of different values of parameter b. Best values are in bold.