An Adaptive Sand Cat Swarm Algorithm Based on Cauchy Mutation and Optimal Neighborhood Disturbance Strategy

The sand cat swarm optimization algorithm (SCSO) is a potent and straightforward meta-heuristic algorithm derived from the acute low-frequency hearing of sand cats, and it performs well on some large-scale optimization problems. However, SCSO still has several disadvantages, including sluggish convergence, low convergence precision, and a tendency to be trapped in local optima. To address these shortcomings, an adaptive sand cat swarm optimization algorithm based on Cauchy mutation and an optimal neighborhood disturbance strategy (COSCSO) is proposed in this study. First and foremost, a nonlinear adaptive parameter that widens the global search helps to retrieve the global optimum from a colossal search space and prevents the algorithm from being caught in a local optimum. Secondly, the Cauchy mutation operator perturbs the search step, accelerating convergence and improving search efficiency. Finally, the optimal neighborhood disturbance strategy diversifies the population, broadens the search space, and enhances exploitation. To assess the performance of COSCSO, it was compared with alternative algorithms on the CEC2017 and CEC2020 competition suites. Furthermore, COSCSO was deployed to solve six engineering optimization problems. The experimental results reveal that COSCSO is strongly competitive and capable of solving practical problems.


Introduction
Throughout history, optimization issues have appeared in all dimensions of people's lives, such as finance, science, and engineering. Nevertheless, with the development of society, optimization problems have become progressively more intricate. Traditional optimization methods, such as the Lagrange multiplier method, the complex method, and queuing theory, require explicit descriptions of the problem conditions and can only handle smaller optimization problems; intricate problems cannot be solved exactly by them in a limited time. At the same time, for nonlinear engineering problems with a large number of constraints and decision variables, traditional optimization methods tend to get caught in a local optimum instead of finding the global optimal solution. Therefore, drawing inspiration from numerous phenomena in nature, researchers have devised a host of powerful and accessible meta-heuristic algorithms that, notably, can strike a superior balance between hopping out of local optima and converging to a single point, in order to arrive at a global optimum and solve sophisticated optimization problems.
These algorithms have been grouped into five principal categories based on the inspiration used to create them: (1) Human-based optimization algorithms are designed based on human brain thinking, systems, organs, and social evolution. An example is the well-known neural network algorithm (NNA) [1], which tackles problems in ways informed by the message transmission of neural networks in the human brain. The harmony search (HS) [2,3] algorithm simulates a musician's ability to achieve a pleasing harmonic state.
This paper provides an enhancement to tackle the optimization problem, with the following primary contributions: (1) The COSCSO, with better performance, is designed by adding three strategies to SCSO.
First, nonlinear adaptive parameters replace the original linear parameters to strengthen the global search and prevent the algorithm from being caught in a local optimum.
Second, the Cauchy mutation operator strategy expedites convergence.
Finally, the optimal neighborhood disturbance strategy enriches population diversity.
(2) The enhanced algorithm is evaluated on test suites of different dimensions and on real engineering optimization problems.
The balance between exploration and exploitation of COSCSO is analyzed on the 30-dimensional CEC2020 test suite.
COSCSO is compared with other competitive algorithms on the CEC2017 test suite and on the CEC2020 test suite in 30 and 50 dimensions.
The improved algorithm is deployed on six engineering optimization problems in conjunction with nine other algorithms.
The remainder of the paper is organized as follows. The second part describes related work on SCSO, and the third part is a summary review of the original sand cat swarm algorithm for searching and attacking prey. The fourth part elaborates on the three improvement strategies in detail. The fifth part presents an analysis of the comparative data of COSCSO, SCSO, and other optimization algorithms, illustrating the superiority of COSCSO. In the sixth part, six engineering examples are collected to verify the capability of COSCSO, compared with other algorithms, in addressing real-world problems. The final part is the conclusion.

Related Works
Since the emergence of the sand cat swarm optimization algorithm, it has received considerable attention from researchers because of its performance. Vahid Tavakol Aghaei, Amir Seyyedabbasi et al. [37] applied SCSO to three diverse nonlinear control systems: the inverted pendulum, the Furuta pendulum, and the Acrobot robotic arm. Simulation experiments showed that SCSO is simple and accessible and can be a viable candidate for real-world control and engineering problems. In addition, several researchers have optimized SCSO for greater performance. Firstly, Li et al. [38] designed an elite collaboration strategy with stochastic variation that selects the top three sand cats in the population; the three elites, assigned different weights, cooperate to form a new sand cat position that guides the search process, avoiding the dilemma of being trapped in a local optimum. Secondly, Amir Seyyedabbasi et al. [39] combined SCSO with reinforcement learning techniques to better balance the exploration and exploitation processes and to solve the mobile node localization problem in wireless sensor networks. Finally, the ISCSO proposed by Lu et al. [40] effectively boosts the fault diagnosis performance of power transformers.

The Sand Cat Swarm Optimization
The sand cat swarm optimization (SCSO) algorithm is a relatively new meta-heuristic optimization algorithm proposed by Amir Seyyedabbasi et al. in 2022. Sand cats live in very barren deserts and mountainous areas; gerbils, hares, snakes, and insects are their dominant food sources. In appearance, sand cats are similar to domestic cats, but one big difference is that their hearing is very sensitive: they can detect low-frequency noise below 2 kHz. They use this special skill to find and attack their prey very quickly. The process from discovering to capturing prey is shown in Figure 1. The sand cat's predation can be compared to the process of finding the optimal value, which is the inspiration for the algorithm.

Initialization
Initially, the population is initialized in a randomized manner so that the sand cats are evenly distributed in the exploration area, where lb and ub are the lower and upper bounds of the variables and rand is a random number between 0 and 1. The resulting initial matrix is shown below, where x i,j denotes the jth dimension of the ith individual; there are N individuals and M variables in total. The matrix of fitness values is likewise shown below. After comparing all fitness values, the minimum is found, and the corresponding individual is the current optimal one.
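The random initialization described above can be sketched as follows (a minimal illustration: the `sphere` fitness function, the bounds, and the population size are placeholders, not the paper's benchmark settings):

```python
import numpy as np

def initialize_population(N, M, lb, ub, rng):
    """Spread N sand cats uniformly over the M-dimensional search space:
    x_ij = lb_j + rand(0, 1) * (ub_j - lb_j)."""
    return lb + rng.random((N, M)) * (ub - lb)

def sphere(x):
    # Placeholder fitness function, used for illustration only.
    return np.sum(x ** 2)

rng = np.random.default_rng(seed=1)
lb, ub = np.full(5, -10.0), np.full(5, 10.0)
pop = initialize_population(20, 5, lb, ub, rng)
fitness = np.array([sphere(ind) for ind in pop])
best = pop[np.argmin(fitness)]   # current optimal individual
```

Minimizing the fitness column then singles out the current best sand cat, exactly as the comparison step above describes.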

Searching for Prey (Exploration)
The sand cat searches for prey mainly using its very sharp sense of hearing, which can detect low-frequency noise below 2 kHz. Its mathematical model in the prey-finding stage is as follows: r_e = S_e × rand(0, 1) (5), where S_M = 2 is the maximum sensitivity, S_e denotes the general sensitivity range of the sand cats, whose value decreases linearly from 2 to 0, and r_e is the sensitivity range of a particular sand cat in the swarm. t is the current iteration count, and T is the maximum number of iterations for the entire search process. X_a(t) is a randomly chosen member of the population, and X(t) is the current position of the sand cat. Notably, when S_e = 0, r_e = 0 as well, and the latest position of the sand cat is also assigned to 0 according to Equation (6), which still lies in the search space.
Furthermore, in order to guarantee a steady balance between the exploration and exploitation phases, the parameter R_e is introduced, with R_e ∈ [−2, 2]; its value is given by Equation (7).

Grabbing Prey (Exploitation)
As the search process progresses, the sand cat attacks the prey found in the previous stage. The mathematical model of the prey-attack phase is as follows, where dist is the distance between the best individual and the current individual, and θ is a random angle between 0° and 360°.

Bridging Phase
The transition of SCSO from the exploration phase to the exploitation phase is governed by the parameter R_e. When |R_e| ≤ 1, the sand cat closes in and captures the prey, which is the exploitation phase; when |R_e| > 1, it continues to search different regions for the prey's location, which is the exploration phase. The pseudo-code of SCSO can be found in [36]. The mathematical model of this switch is:
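The search, attack, and bridging phases can be summarized in one update routine. The sketch below follows the standard SCSO description; since Equations (5)–(10) are not reproduced above, the exact update expressions here are assumptions rather than the paper's verbatim equations:

```python
import numpy as np

def scso_step(pop, best, t, T, rng, S_M=2.0):
    """One iteration of the original SCSO update (a sketch following the
    standard SCSO description; details may differ from Equations (5)-(10))."""
    N, M = pop.shape
    S_e = S_M - (S_M * t) / T                 # general sensitivity, 2 -> 0 linearly
    new_pop = np.empty_like(pop)
    for i in range(N):
        r_e = S_e * rng.random()              # per-cat sensitivity, cf. Eq. (5)
        R_e = 2.0 * S_e * rng.random() - S_e  # transition parameter, cf. Eq. (7)
        if abs(R_e) <= 1:                     # exploitation: attack the prey
            theta = rng.random() * 2 * np.pi  # random angle in [0, 360)
            dist = np.abs(rng.random(M) * best - pop[i])
            new_pop[i] = best - r_e * dist * np.cos(theta)
        else:                                 # exploration: search new regions
            cand = pop[rng.integers(N)]       # X_a(t): a random population member
            new_pop[i] = r_e * (cand - rng.random(M) * pop[i])
    return new_pop

rng = np.random.default_rng(0)
pop = rng.random((10, 4)) * 20.0 - 10.0
best = pop[0].copy()
new_pop = scso_step(pop, best, t=1, T=100, rng=rng)
```

Note how the single parameter S_e drives both branches: as it decays toward 0, |R_e| ≤ 1 holds ever more often and the step length r_e shrinks, so the swarm drifts from exploration into exploitation.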

Improved Sand Cat Swarm Optimization
In SCSO, the sand cat uses its ability to detect low-frequency noise below 2 kHz to capture prey. The algorithm is straightforward, easy to implement, and iterates quickly toward the best position. However, it has some shortcomings, such as a tendency to get stuck in local optima and premature convergence. This paper therefore optimizes and improves the algorithm with three strategies: a nonlinear adaptive parameter, a Cauchy mutation strategy, and an optimal neighborhood disturbance strategy.

Nonlinear Adaptive Parameters
In SCSO, the parameter S_e plays a prominent role. Firstly, it indicates the sensitivity range of the sand cat's hearing. Secondly, it determines the size of the parameter R_e, which in turn is accountable for balancing the global search and local exploitation phases of the iterative process; thus S_e also coordinates the exploration and exploitation phases. Finally, it is a crucial component of the convergence factor r_e, which affects the speed of convergence during the iteration. In the original algorithm, S_e decreases linearly from 2 to 0. This idealized law is not representative of the actual sand cat's predation ability, so a nonlinear adaptive parameter strategy is utilized, with the formula given in Equation (11).
Here, q_t = 1 − 2(q_{t−1})², with q_t ∈ [0, 1] and an initial value of 0.5. The variation curves of the parameter S_e before and after the improvement are displayed in Figure 2. Comparing the two curves, the modified S_e has a larger value in the early portion of the optimization process, focusing on the global search; moreover, due to the perturbation of q_t, the value of S_e sometimes becomes smaller during this stage, which caters to the local search and yields faster convergence and more precise search accuracy. In the later part of the optimization process, the value is on the lower side, focusing on the local search, and due to the perturbation of q_t, the value of S_e sometimes becomes larger, which helps the algorithm avoid becoming bogged down in local optima.

Cauchy Mutation Strategy
The Cauchy distribution is distinguished by long tails at both ends and a peak at the central origin. Introducing the Cauchy mutation operator [41-43] as a mutation step gives each sand cat a greater likelihood of jumping to a better place. Upon reaching a local optimal solution, the Cauchy mutation operator perturbs the step size, making it larger, which causes the sand cat to jump away from the local optimal position. Conversely, the operator makes the step size smaller and speeds up convergence when the individual is pursuing the global optimum. The Cauchy mutation has been integrated with many algorithms, such as MFO and CSO. The Cauchy distribution function and its probability density function are as follows:

F(x) = 1/2 + (1/π) arctan((x − x_0)/γ)

f(x) = 1/(πγ[1 + ((x − x_0)/γ)²])

where x_0 is the location parameter marking the position of the peak and γ is the scale parameter, equal to half the width of the curve at half its maximum. With x_0 = 0 and γ = 1, the standard Cauchy distribution is obtained; its probability density function is given in Equation (14), and Figure 3 shows the probability density curve of the standard Cauchy distribution. To diminish the probability of SCSO dropping into a local optimum, this paper uses the Cauchy mutation operator to promote the global search ability of the algorithm, expedite convergence, and increase population diversity. The individual update then becomes Equation (15), where C(0, 1) is a random number that follows the standard Cauchy distribution.
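For illustration, the density and an inverse-transform sampler for the Cauchy distribution can be written as follows; `cauchy_mutate` is a hypothetical form of the update, since the exact rule is specified by Equation (15) itself:

```python
import math

def cauchy_pdf(x, x0=0.0, gamma=1.0):
    """Probability density of the Cauchy distribution with location x0
    and scale gamma (half-width at half-maximum)."""
    return 1.0 / (math.pi * gamma * (1.0 + ((x - x0) / gamma) ** 2))

def cauchy_sample(u, x0=0.0, gamma=1.0):
    """Inverse-transform sampling: for u ~ Uniform(0, 1),
    x = x0 + gamma * tan(pi * (u - 0.5)) follows the Cauchy distribution.
    The heavy tails occasionally yield very large steps, letting an
    individual jump away from a local optimum."""
    return x0 + gamma * math.tan(math.pi * (u - 0.5))

def cauchy_mutate(position, u):
    # Assumed illustrative update: scale the position by a standard-Cauchy
    # random number C(0, 1); the paper's Equation (15) defines the real rule.
    c = cauchy_sample(u)
    return [x + c * x for x in position]
```

Because the tails of the density decay only as 1/x², occasional samples are orders of magnitude larger than a Gaussian step would produce, which is precisely the escape mechanism described above.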

Optimal Neighborhood Disturbance Strategy
When a sand cat swarm is feeding, all individuals move towards the location of the prey, a circumstance that may lead to homogeneity of the population and is not conducive to the fluidity of the global search phase. Therefore, an optimal neighborhood disturbance strategy [44] is utilized: when the global optimum is updated, a further search is performed around it. In this way, population diversity can be enriched and local optima avoided. The optimal neighborhood disturbance is given as follows:

X*_best(t) = X_best(t) + 0.5 · r_1 · X_best(t), if r_2 < 0.5; X*_best(t) = X_best(t), otherwise

where X*_best(t) is the new individual generated after the disturbance and r_1, r_2 ∈ [0, 1] are random numbers. After the optimal neighborhood search, a greedy strategy is adopted to decide whether to keep the new individual. The specific formula is as follows:
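A minimal sketch of the disturbance plus greedy selection is given below; the disturbance probability `prob` and the per-coordinate use of r_1 and r_2 are assumptions, since Equation (16) defines the precise rule:

```python
import numpy as np

def neighborhood_disturbance(best, fitness_fn, rng, prob=0.5):
    """Optimal neighborhood disturbance followed by greedy selection
    (an assumed per-coordinate form: with probability `prob`, driven by r2,
    a coordinate of the best individual is perturbed by 0.5 * r1 * X_best;
    otherwise it is left unchanged)."""
    r1 = rng.random(best.shape)
    r2 = rng.random(best.shape)
    candidate = np.where(r2 < prob, best + 0.5 * r1 * best, best)
    # Greedy strategy: keep the perturbed individual only if it improves.
    if fitness_fn(candidate) < fitness_fn(best):
        return candidate
    return best

rng = np.random.default_rng(42)
sphere = lambda x: float(np.sum(x ** 2))   # placeholder fitness
best = np.array([0.5, -0.3, 0.1])
new_best = neighborhood_disturbance(best, sphere, rng)
```

The greedy comparison guarantees the stored best never degrades, so the disturbance can only add diversity, never lose progress.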

COSCSO Steps
In this work, a nonlinear adaptive parameter, a Cauchy mutation strategy, and an optimal neighborhood disturbance strategy are combined to modify the standard SCSO algorithm, forming the COSCSO algorithm. The fundamental steps of COSCSO are as follows: Step 1. Initialization: set the population size N, the maximum number of iterations T, and the required parameters.
Step 2. Compute and compare the fitness value of each sand cat and obtain the current best position.
Step 3. Update the nonlinear parameter S_e and the parameters r_e and R_e by means of Equations (11), (5), and (7).
Step 4. Generate the Cauchy mutation operator C(0, 1).
Step 5. Update the position of each sand cat using the Cauchy mutation operator according to Equation (15).
Step 6. Compare the fitness value of the updated individual with that of the current best individual; if the former is better, update the best individual's position.
Step 7. Generate new individuals by perturbing the existing best individual according to the optimal neighborhood disturbance strategy using Equation (16).
Step 8. Compare the fitness values of the freshly generated individual and the best individual according to the greedy strategy, and update the position of the best individual if the former is preferable.
Step 9. Return to Step 3 if the maximum number of iterations T has not been reached; otherwise, continue with Step 10.
Step 10. Output the global best position and its corresponding fitness value. For a more concise description of the COSCSO procedure, the pseudo-code of the algorithm is given in Table 1 and the flowchart in Figure 4.
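The steps above can be assembled into a compact driver loop. The sketch below is illustrative only: the nonlinear parameter in Step 3, the Cauchy-perturbed update in Steps 4-5, and the disturbance in Step 7 use assumed forms, since Equations (11), (15), and (16) are given in the paper rather than reproduced here:

```python
import numpy as np

def coscso(fitness, lb, ub, N=30, T=200, seed=0):
    """A compact sketch of the COSCSO workflow (Steps 1-10); the update
    formulas are assumed, illustrative stand-ins for Eqs. (11), (15), (16)."""
    rng = np.random.default_rng(seed)
    M = len(lb)
    pop = lb + rng.random((N, M)) * (ub - lb)            # Step 1
    fit = np.apply_along_axis(fitness, 1, pop)           # Step 2
    best = pop[np.argmin(fit)].copy()
    for t in range(1, T + 1):
        S_e = 2.0 * (1.0 - t / T) ** 2                   # Step 3 (assumed nonlinear decay)
        for i in range(N):
            r_e = S_e * rng.random()
            R_e = 2.0 * S_e * rng.random() - S_e
            cauchy = np.tan(np.pi * (rng.random() - 0.5))  # Step 4: C(0, 1)
            if abs(R_e) <= 1:                            # Step 5: exploitation + Cauchy step
                theta = rng.random() * 2 * np.pi
                dist = np.abs(rng.random(M) * best - pop[i])
                pop[i] = best - r_e * cauchy * dist * np.cos(theta)
            else:                                        # exploration
                cand = pop[rng.integers(N)]
                pop[i] = r_e * (cand - rng.random(M) * pop[i])
            pop[i] = np.clip(pop[i], lb, ub)
            if fitness(pop[i]) < fitness(best):          # Step 6
                best = pop[i].copy()
        r1 = rng.random(M)                               # Step 7: disturbance (assumed form)
        candidate = best + 0.5 * r1 * best
        if fitness(candidate) < fitness(best):           # Step 8: greedy selection
            best = candidate
    return best, fitness(best)                           # Step 10

best, best_fit = coscso(lambda x: float(np.sum(x ** 2)),
                        np.full(5, -10.0), np.full(5, 10.0), N=20, T=100)
```

Because the best individual is only ever replaced by a strictly better candidate, the returned fitness is monotonically non-increasing over the iterations.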

Computational Complexity of COSCSO Algorithm
The computational complexity of an algorithm is defined as the volume of resources it consumes during execution. When the COSCSO program runs, the complexity of updating each D-dimensional individual in the population is O(D). Then, for a population of N individuals, the computational complexity per iteration is O(N × D), and since the search must be executed for T iterations to obtain the final result, the overall complexity is O(T × N × D). In the following sections, we test the capability of COSCSO on different test suites and concrete engineering problems.

Numerical Experiments and Analysis
In this chapter, the balance between the exploration and exploitation processes of COSCSO is first discussed. Then, the more challenging CEC2017 and CEC2020 test suites are selected to test the final performance of COSCSO. COSCSO is evaluated against the standard SCSO as well as an extensive variety of meta-heuristic algorithms; the parameter values required by all algorithms are specified in Table 2. All statistical experiments are conducted on the same computer. In addition, all algorithms are run in 20 independent executions on each function, with N = 50 and T = 1000, and the optimization results are compared by analyzing the average and standard deviation of the best solutions.

Exploration and Exploitation Analysis
Exploration and exploitation play an integral role in the optimization process. Therefore, when evaluating algorithm performance, it is vital to discuss not only the ultimate consequences of the algorithm but also the nature of the balance between exploration and exploitation [45]. Figure 5 gives a diagram of the exploration and exploitation of COSCSO on the 30-dimensional CEC2020 test suite.
As we can observe from the figure, the algorithm progressively transitions from the exploration phase to the exploitation phase. On the simpler basic functions F2 and F4 and the most complex composition function F9, COSCSO moves to the exploitation phase around the 10th iteration and rapidly reaches a high exploitation ratio, illustrating the greatly enhanced convergence accuracy of COSCSO. On the hybrid functions F5, F6, and F7, COSCSO also preserves a strong exploration ability in the middle and late stages, effectively avoiding falling into a local optimum.
Table 3 gives the results obtained by running COSCSO and the other competing algorithms 20 times. COSCSO ranks first on 24 test functions, accounting for about 82.76% of all test functions. First, on the single-peak test functions, COSCSO has a distinct advantage over the others in terms of mean value and achieves a smaller standard deviation. Next, on the multi-peak test functions, although COSCSO is weaker than PSO on F5 and F6, it is more competitive than the other nine algorithms. Furthermore, on the hybrid functions, except for F15 and F19, COSCSO is clearly superior to the other algorithms, especially on F12-F14, F16, and F18, where COSCSO leads in both mean and standard deviation. Finally, on the composition functions, COSCSO is far ahead on F22, F28, and F30, but on F21 it is marginally weaker than PSO and SCSO. The last row of the table shows the average ranking of the ten algorithms: COSCSO > HHO > SCSO > PSO > DO > ATOA > AOA > NCHHO = BWO > RSA. In summary, the COSCSO algorithm has superior optimization ability on the CEC2017 test suite, which fully demonstrates that the three strategies effectively boost convergence accuracy and efficiency and greatly mitigate the defects of the original algorithm.
Table 3 also reports the Wilcoxon rank-sum test p-values [54] derived from 20 runs of the other meta-heuristic algorithms on the 30-dimensional CEC2017 problems at the 95% significance level (α = 0.05), using COSCSO as the benchmark. In the last row of statistical results, "+" denotes the number of functions on which an algorithm outperforms COSCSO, "=" denotes the number on which there is no significant difference between the two algorithms, and "−" denotes the number on which COSCSO outperforms the other algorithm.
Combining the rankings of each algorithm, COSCSO is significantly superior to RSA, BWO, DO, AOA, NCHHO, and ATOA on all test functions; it is worse than PSO on F6 and F21 but clearly preferable to PSO on 14 test functions. Altogether, COSCSO shows distinctly better competence than the other algorithms and is a wise choice for solving the CEC2017 problems.
Figure 6 illustrates the convergence curves of COSCSO and the other algorithms on the CEC2017 test functions. Observing the curves, COSCSO is a dramatic enhancement over SCSO. Although COSCSO is at a disadvantage compared to PSO on F5, F6, and F21, and inferior to ATOA on F15 and F19, it remains superior to the other algorithms. On the remaining functions, COSCSO clearly converges faster and with higher accuracy than SCSO. These advantages are attributed to the three major strategies of adaptive parameters, the Cauchy mutation operator, and optimal neighborhood disturbance, which keep the algorithm from dropping into local optima and from premature convergence.
Figure 7 depicts the box plots of COSCSO and the other algorithms on the CEC2017 test functions. The height of a box reflects the spread of the data; a narrower box represents more concentrated data and a more stable algorithm. Points beyond the normal range of the data are marked with a "+". From the figure, on F1, F3, F4, F11, F12, F14, F15, F17, F18, F27, F28, and F30, the boxes of COSCSO are significantly narrower than those of the other algorithms. In addition, except on F22, COSCSO has almost no outliers. This implies that it operates more stably and is robust in solving the CEC2017 test functions.
Radar maps, also known as spider web maps, project multi-dimensional data onto axes and indicate how high or low the weight of each variable is. Figure 8 shows the radar maps of COSCSO and the other algorithms, plotted based on the rankings of the ten meta-heuristic algorithms on the CEC2017 test functions. From the figure, COSCSO constitutes the smallest shaded area, which further illustrates the capacity of COSCSO over the other nine comparative algorithms. The shaded area of HHO ranks second, which indicates that HHO offers some competition to COSCSO.

Comparison and Analysis on the CEC2020 Test Suite
In order to further test the optimization ability of COSCSO, it is also evaluated on the 30-dimensional and 50-dimensional CEC2020 test suites. The CEC2020 test suite [55] is composed of parts of the CEC2014 test suite [56] and the CEC2017 test suite. Besides SCSO, eight other optimization algorithms are compared: WOA [57], RSA, PSO, CHOA, AOA, HHO, NCHHO, and ATOA. All parameter settings remain identical except for the number of dimensions.
Table 4 gives the experimental results of each algorithm on the 30-dimensional CEC2020 test suite. The data show that COSCSO is ahead of SCSO and the other comparative algorithms on nine test functions. On F6, HHO ranks first and COSCSO second, better than the other eight algorithms. The smallest standard deviations on F1, F5, and F7 indicate that COSCSO is more stable on these test functions. The overall ranking is COSCSO > HHO > SCSO > PSO > WOA > ATOA > CHOA > AOA > RSA > NCHHO. The average rank of COSCSO is 1.1, placing it first overall, and the average rank of HHO is 2.8, placing it second, which shows that COSCSO is consistently first among all algorithms. In addition, Table 4 lists the p-values for each algorithm, from which it can be seen that COSCSO as a whole outperforms all compared algorithms; it is far ahead of WOA, RSA, PSO, CHOA, AOA, NCHHO, and ATOA, while for HHO and SCSO there is no major difference on a few test functions. This reveals that COSCSO is highly suitable for solving the 30-dimensional CEC2020 problems.
Figure 9 presents the convergence curves of COSCSO and the other algorithms on the 30-dimensional CEC2020 test suite. Together with the data in the table, it visually illustrates that COSCSO converges faster and with higher accuracy on F1, F2, F5, F7, and F8; it is poorer than HHO on F6.
Figure 10 displays the box plots of COSCSO and the other algorithms on the 30-dimensional CEC2020 test functions. COSCSO has the smallest median on F1, F2, F5, F7, and F8 compared to the other nine algorithms. In the plots of F1, F5, F7, F8, and F10, the box of COSCSO is narrower, suggesting that COSCSO is more stable and relatively robust on these functions.
Figure 11 presents the radar maps based on the rankings of COSCSO and the other nine algorithms on the 30-dimensional CEC2020 test suite. Judging from the areas of the radar maps, COSCSO ranks at the top on all functions, which intuitively shows the superiority of COSCSO and its applicability to the 30-dimensional CEC2020 problems.
Table 5 contains the experimental data of each algorithm for each metric on the 50-dimensional CEC2020 test functions. In this experiment, COSCSO achieved better fitness values on eight test functions. Although inferior to the original algorithm on F2 and F3, COSCSO performed competitively compared to the other eight algorithms. The third row from the bottom gives the average rank of the ten algorithms; COSCSO has an average rank of 1.4, ranking first. The combined ranking of the algorithms is: COSCSO > SCSO > HHO > PSO > WOA > ATOA > CHOA > RSA > AOA > NCHHO. This fully reflects the ability of COSCSO to solve the CEC2020 problems. Rank-sum tests are also documented in Table 5: with COSCSO as the benchmark, the other meta-heuristic algorithms were run 20 times on the 50-dimensional CEC2020 problems at the 95% significance level (α = 0.05). Looking at the last row, COSCSO clearly excels SCSO on six test functions and outperforms the other algorithms on most test functions.
The convergence plots in Figure 12 show its performance in solving the 50-dimensional CEC2020 problems more directly: COSCSO surpasses all the other algorithms and ranks first on all functions except F2, F3, and F4. In Figure 13, the medians of all algorithms are the same except for PSO on F4, and the median of COSCSO is lower than those of the other algorithms except on F3, F6, and F9. The box plots of COSCSO on F1, F5, F7, and F10 are extremely narrow, indicating good stability and robustness. Figure 14 shows the radar maps of COSCSO and the other algorithms. Observing the areas, the shaded area of COSCSO is the smallest and relatively round, which indicates that COSCSO has stable and remarkable capability and can be deployed to solve the 50-dimensional CEC2020 problems.

Engineering Problems
This chapter tests the ability of COSCSO to solve practical problems [58]. In the following, ten algorithms are devoted to addressing six practical engineering problems: the welded beam, pressure vessel, gas transmission compressor, heat exchanger, tubular column, and piston lever design problems. In particular, the constrained problems are converted into unconstrained problems by utilizing penalty functions. In the comparison experiments, N = 30, T = 500, and the number of independent runs is set to 20.
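Since the engineering problems are constrained, a penalty function turns each of them into an unconstrained minimization that a meta-heuristic can handle directly. The following sketch uses a static quadratic exterior penalty; the paper does not specify its penalty form, so the penalty type and weight `mu` are assumptions:

```python
def penalized(objective, constraints, mu=1e6):
    """Convert a constrained minimization problem into an unconstrained one
    with a static exterior penalty (an assumed, common choice). Each g in
    `constraints` encodes an inequality constraint g(x) <= 0; violations
    are squared and weighted by `mu`."""
    def f(x):
        penalty = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + mu * penalty
    return f

# Toy usage: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
f = penalized(lambda x: x * x, [lambda x: 1.0 - x])
```

Feasible points are charged nothing, while infeasible ones are pushed steeply uphill, so an unconstrained optimizer such as COSCSO is steered back into the feasible region.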

Welded Beam Design
The objective of this problem is to construct a welded beam [59] at minimal expense under bounds on the shear stress (η), bending stress (λ), buckling load (Q_C), and end deflection (µ) of the beam. It takes the weld thickness h, the joint length l, the beam height t, and the beam thickness b as variables; the design schematic is shown in Figure 15.
Subject to: Variable range: Ten competitive meta-heuristic algorithms are used to solve this problem: COSCSO, SCSO, WOA, AO [60], SCA [61], RSA, HS, BWO, HHO, and AOA. Table 6 gives the optimal cost obtained by each algorithm for the welded beam design problem and the corresponding decision variables. It is apparent from the table that COSCSO generates the lowest cost. Table 7 shows the statistical results for all algorithms over 20 runs; COSCSO obtains the best ranking on all indicators. In conclusion, COSCSO is highly competitive in solving the welded beam design problem.

Pressure Vessel Design
The main purpose of this problem is to fabricate a pressure vessel [62] at the least cost under a host of constraints. It treats the shell thickness T_1, head thickness T_2, inner radius R*, and the length S of the cylindrical section without the head as variables. The design schematic is presented in Figure 16. The mathematical model of the problem is shown in Equation (19).
Subject to: Variable range: This problem is solved by ten algorithms: COSCSO, SCSO, WOA, AO, HS, RSA, SCA, BWO, BSA [63], and AOA. Table 8 contains the optimal cost of COSCSO and the other compared algorithms and the corresponding decision variables. Four further statistics for each algorithm are included in Table 9. The result of COSCSO is the best among the ten algorithms and is relatively stable.

Gas Transmission Compressor Design Problem
The key target of this problem [64] is to minimize the total expense of transporting 100 million cubic feet of gas per day. There are three design variables: the distance between the two compressors (L), the pressure ratio of the first compressor to the second (δ), and the inside diameter of the natural gas pipeline (H). The gas transmission compressor is shown in Figure 17. Let K = [k_1, k_2, k_3] = [L, δ, H]. The model is given in Equation (20).
Variable range: In addition to SCSO, we pick RSA, BWO, SOA [65], WOA, SCA, HS, AO, and AOA to compare with COSCSO. The best results of the different algorithms and the corresponding decision variables are summarized in Table 10; the best result of COSCSO is substantially smaller than those of the other algorithms. The statistical results of all algorithms are collected in Table 11, where COSCSO's standard deviation is the smallest, indicating its high stability.

Heat Exchanger Design
This is a minimization problem for heat exchanger design [66], with eight variables and six inequality constraints. It is specified in Equation (21).
For this problem, nine algorithms, such as WOA and HHO, are compared with COSCSO. Table 12 lists the best results of COSCSO and the other algorithms and the corresponding decision variables. The statistical results of each algorithm are listed in Table 13. Apparently, COSCSO obtains better results and is very competitive among all ten algorithms.

Tubular Column Design
The goal of this problem is to minimize the cost of designing a tubular column [67] to bear compressive loads under six constraints. It contains two decision variables: the average diameter of the column (D) and the thickness of the tube (b); let K = [k_1, k_2] = [D, b]. Its design schematic is depicted in Figure 18. The model of this problem is given in Equation (22): min f(K) = 9.8k_1k_2 + 2k_1, subject to the six constraints, where δ_y = 500 and E = 0.84 × 10^6.

Piston Lever Design
The primary goal of this problem [68] is to minimize the amount of oil consumed when the piston lever is tilted from 0° to 45° under four constraints, thereby determining H, B, D, and K. The schematic is shown in Figure 19. The mathematical expression of the problem is Equation (23).
Besides COSCSO and SCSO, algorithms such as SOA, MVO [69], and HHO were also included in the experiment. As Tables 16 and 17 show, COSCSO is the best choice among these ten algorithms for solving this problem.

Conclusions and Future Work
In this paper, an SCSO based on adaptive parameters, Cauchy mutation, and an optimal neighborhood disturbance strategy is proposed. The nonlinear adaptive parameter replaces the linear parameter and strengthens the global search, which helps prevent premature convergence and puts exploration and exploitation in a more balanced state. The Cauchy mutation operator perturbs the search step to speed up convergence and improve search efficiency. The optimal neighborhood disturbance strategy enriches population diversity and prevents the algorithm from getting trapped in a local optimum. COSCSO was evaluated against the standard SCSO and other challenging swarm intelligence optimization algorithms on the CEC2017 and CEC2020 suites in distinct dimensions. Comparisons of mean and standard deviation, convergence, stability, and statistical analyses were performed. The results prove that COSCSO converges more rapidly, with higher accuracy, and stays more stable; in contrast to the other algorithms, COSCSO is more advanced. Moreover, COSCSO was deployed to solve six engineering problems. From the experimental results, it can be concluded that COSCSO also has the potential to solve practical problems.
The COSCSO algorithm has a strong exploration ability, which can effectively avoid falling into local optima and prevent premature convergence. However, its exploitation ability is weaker and its convergence speed is relatively slow. In the future, novel strategies could further improve the algorithm and its convergence speed so that it can tackle more high-dimensional optimization problems and be employed in various fields, such as feature selection, path planning, image segmentation, and fuzzy recognition.