Enhanced Sparrow Search Algorithm with Mutation Strategy for Global Optimization

In order to improve the performance of the sparrow search algorithm (SSA), this paper proposes a novel series of SSA variants by combining SSA with improved Tent chaos mutation (IT), Lévy flights mutation (LF), elite opposition-based learning mutation (EOBL), variable radius mutation (VR), and the combination of IT, LF, EOBL and VR, yielding ITSSA, LFSSA, EOBLSSA, VRSSA and CMSSA, respectively. Initially, the performance of these variants is evaluated on a comprehensive set of 31 benchmark test functions. Moreover, the best algorithm among these variants is compared with 19 state-of-the-art optimization algorithms on the same 31 benchmark test functions to validate its performance. The convergence and computational complexity of the best variant are also analyzed to test its exploration, exploitation and local optima avoidance. The best variant is then employed on eight real-world constrained engineering problems to further verify its robustness. The experimental results reveal that the best of the SSA variants outperforms the other competitors and is highly effective in solving real-life cases.



I. INTRODUCTION
Optimization problems have been a hot research topic and arise widely in science, engineering, economics, management and other fields. Over the past few years, deterministic optimization techniques have been widely used to solve these problems. However, these techniques have limitations in obtaining good solutions, especially for multimodal, nonlinear constrained and complex real-world optimization problems. Therefore, to better deal with such cases, meta-heuristic algorithms have developed rapidly in recent years. The advantages of these methods are flexibility, a gradient-free mechanism and the ability to avoid local optima. In addition, since they belong to a class of stochastic techniques, their different random operators help them effectively escape local optima when solving real-world problems. Meta-heuristic algorithms are usually divided into evolutionary-based, swarm-based, physics-based, and human-based algorithms. Evolutionary-based algorithms are generally inspired by the process of biological evolution: the individuals of each generation are generated from the previous generation through selection, reproduction, mutation, etc., and the population is iteratively updated to approach the global optimum. However, the disadvantages of these algorithms are that they discard information from previous generations, and that their computational complexity is high due to the large number of operators. A few famous evolutionary-based algorithms are presented in Table 1.

TABLE 1. A FEW FAMOUS EVOLUTIONARY-BASED ALGORITHMS

Algorithm | Authors | Year | Inspiration
Genetic algorithm (GA) [1] | Holland John H. | 1992 | Darwinian theory of evolution
Genetic programming (GP) [2] | Koza J. R. | 1994 | Darwinian principle of survival of the fittest and genetic crossover
Differential evolution (DE) [3] | R. Storn, K. Price | 1997 | Like genetic algorithms, using similar operators: crossover, mutation and selection
Fast evolution strategies (FES) [4] | Yao X., Liu Y. | 1997 | Cauchy distribution and evolution strategies
Evolutionary programming made faster (FEP) [5] | Xin Yao, et al. | 1999 | Gaussian mutation and evolutionary programming
Biogeography-based optimization (BBO) [6] | D. Simon | 2008 | Mathematics of biogeography

Swarm-based algorithms are inspired by the collective intelligence of social creatures (e.g., fireflies, ant lions, grey wolves, seagulls). These methods are easy to implement since they have fewer parameters and operators. Owing to information sharing, coevolution and learning among population agents, they can efficiently avoid local optima. A few well-known swarm-based algorithms are presented in Table 2.
Physics-based algorithms mimic physical phenomena (such as black holes, the force of gravity, electricity, etc.) in nature. These algorithms have a high ability to avoid local optima since information is exchanged between the candidate solutions. Some popular algorithms in this category are shown in Table 3.
Human-based algorithms are inspired by different human behaviors (such as brainstorming, competition, teaching and learning). They usually have the characteristics of organization, persistence, simplicity and intelligence. The search strategy of these algorithms differs from that of the other types: their search agents are updated and iterated according to different human behaviors to avoid local optima. Some state-of-the-art algorithms in this category are shown in Table 4.
In recent years, various meta-heuristic algorithms inspired by different concepts have thus appeared in quick succession. However, the metaheuristic frameworks of most of these algorithms are similar to some extent [58]. What these meta-heuristic algorithms share is that they can achieve good solutions and have two search stages: exploration and exploitation [59]. The exploration stage seeks the global optimal solution across the search space as widely as possible. The exploitation stage further refines the solution found during exploration to improve the search accuracy. Relying only on exploration may decrease convergence accuracy, while relying only on exploitation may increase the probability of getting trapped in a local optimum. Therefore, how to balance exploration and exploitation remains a challenging problem. In addition, although these metaheuristic algorithms can obtain good solutions for many problems, they have some limits, such as parameters that must be set manually [60]. According to the No Free Lunch theorem [61], these popular meta-heuristic algorithms are not guaranteed to find the global optimum for all optimization problems, especially NP-hard ones. Hybridizing algorithms with other approaches is one effective remedy [56], e.g., a fixing strategy inspired by ACO combined with an exact large neighborhood search [64], integrating LP [65], etc. Improving a single algorithm with other mechanisms is another effective way. For instance, combining Newton's second law and the equations of motion, the inclined planes system optimization algorithm (IPO), inspired by the dynamics of tiny balls sliding along frictionless inclined planes, was proposed and effectively handled some single-objective optimization problems [66]. It has the advantages of high stability, robustness and high convergence efficiency. In recent years, the IPO algorithm has been used to solve various optimization problems, such as the optimal design of the level shifter circuit [67], the data clustering problem [68], the unsupervised data and histogram clustering problem [69], the automatic design of a neuro-fuzzy classifier [70], the optimal architecture of MLP neural networks [71], the optimal design of IIR digital filters [72], the IIR system identification problem [73], IIR system modeling [74], epileptic seizure detection [75], etc. The multimodal IPO (MMIPO) algorithm was efficiently used to solve multimodal optimization problems [76]. A fourth-order Butterworth filter was effectively and efficiently designed with the multi-objective inclined planes system optimization (MOIPO) algorithm [77]. The IIR model identification problem was optimized by a modified inclined planes system optimization (MIPO) algorithm using an appropriate mechanism based on the executive steps of the algorithm with constant damp factors [78]. Compared to other multi-objective methods, the multi-objective modified inclined planes system optimization (MOMIPO) algorithm performed better on the ring oscillator optimal design [79]. The MOIPO algorithm was also used for the optimization of CMOS cross-coupled LC voltage-controlled oscillators [80]. A simplified and efficient version of IPO (SIPO) with high reliability and stability was proposed and shown to be superior to IPO and MIPO [81].
An adaptive neuro-fuzzy inference system classifier was also effectively designed by the variable-length inclined planes system optimization (VLIPO) method [82]. To better balance exploration and exploitation while seeking the global optimum, the sparrow search algorithm (SSA) was proposed by Jiankai Xue, mainly inspired by the group wisdom, foraging and anti-predation behaviors of sparrows [30]. Compared with PSO, GWO and GSA, SSA is a superior meta-heuristic algorithm with a fast convergence rate, higher stability and strong robustness. However, the basic SSA easily falls into local optima on highly multimodal and complex problems. To improve the performance of the basic SSA, the chaos sparrow search optimization algorithm (CSSA) combining the Tent chaotic sequence and Gaussian mutation [83] and the improved sparrow algorithm (ISSA) combining Cauchy mutation and opposition-based learning [84] were proposed. The adaptive sparrow search algorithm (ASSA), which introduces an adaptive learning factor, the DE/best/1 mutation strategy and a dynamic scaling factor, was proposed to deal with the optimal parameter identification of PEMFCs [85]. Combining center-of-gravity reverse learning, a learning coefficient and Cauchy mutation operators, an improved sparrow search algorithm with good steady-state performance was applied to the distributed maximum power point tracking problem [86]. By introducing a chaotic map, adaptive inertia weight and Cauchy-Gaussian mutation strategies, the modified sparrow search algorithm (CASSA) was proposed to efficiently solve the UAV route planning problem [87]. A convolutional neural network was optimized by an enhanced sparrow search algorithm (ESSA) improved with an opposition-based learning (OBL) mechanism and a merit function mechanism [88]. Using the SCA algorithm and a labor cooperation structure, an improved sparrow search algorithm (SCA-CSSA) was proposed to solve the labeled and unlabeled data classification problem [89]. Based on the logistic map, a chaotic sparrow search algorithm was incorporated into the stochastic configuration network [90]. A lens learning sparrow search algorithm (LLSSA), which combines a reverse learning strategy, a variable spiral search strategy and the simulated annealing algorithm, was applied to the 3D UAV path planning problem [91]. SSA has also been used to optimize the extreme learning machine model [92], the parameters of the variational mode decomposition method [93], and the penalty factor and kernel function parameter of SVM [94]. Using K-means clustering, the Lévy flight mechanism and an adaptive local search strategy, the multi-strategy improved sparrow search algorithm (KLSSA) was proposed to overcome the shortcomings of SSA [95]. A hybrid SSA-PSO algorithm was proposed to solve the software defect prediction problem [96]. To reduce the large error of DV-Hop, the ISSADV-Hop algorithm was proposed based on an improved sparrow search algorithm that introduces the Lévy flight mechanism [97]. Another improved sparrow search algorithm was built from an adaptive local search strategy, an improved Tent chaotic map and Cauchy mutation [98]. To deal with the economic optimization of the microgrid cluster problem, a chaos sparrow search algorithm was proposed combining the Bernoulli chaotic map, dynamic adaptive weighting, Cauchy mutation and reverse learning [99].
Although the above methods improve the search ability and convergence speed of the basic SSA, they still struggle to avoid local optima on more complex problems. It can therefore be seen that the basic SSA has difficulty obtaining the global optimal solution, especially for high-dimensional and complex multimodal problems within a limited number of iterations, and that simple mutation schemes can still hardly balance exploration and exploitation.
In this paper, a novel series of SSA variants (namely ITSSA, LFSSA, EOBLSSA, VRSSA and CMSSA) is proposed by combining SSA with different mutation operators (i.e., improved Tent chaos mutation (IT), Lévy flights mutation (LF), elite opposition-based learning mutation (EOBL) and variable radius mutation (VR)). In addition, the performance of these variants is evaluated on 31 benchmark test functions. Further, the performance of the best SSA variant is comprehensively tested on the 31 benchmark test functions and eight constrained problems. The results show that the best SSA variant is very competitive when compared to existing methods.
The remaining sections are organized as follows: a brief review of the basic SSA is presented in Section II. To further balance the exploration and exploitation phases effectively, Section III proposes a series of mutation-based SSA variants, namely ITSSA, LFSSA, EOBLSSA, VRSSA and CMSSA. Experimental results and the applicability to mechanical engineering problems are presented in detail in Section IV. Finally, conclusions and future work are summarized in Section V.

II. OVERVIEW OF THE SPARROW SEARCH ALGORITHM (SSA)
SSA is a novel meta-heuristic optimization algorithm proposed by Jiankai Xue in 2020, mainly inspired by the foraging and anti-predation behaviors of sparrows [30]. The main principle of SSA can be simplified as a discoverer-participant mathematical model of the foraging process, into which reconnaissance and alarming behavior are then introduced. The detailed steps of SSA are as follows:

Step 1: Assuming that there are N sparrows in the D-dimensional space, the positions X of the sparrows can be expressed by Eq. (1):

X = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,D} \\ x_{2,1} & x_{2,2} & \cdots & x_{2,D} \\ \vdots & \vdots & & \vdots \\ x_{N,1} & x_{N,2} & \cdots & x_{N,D} \end{bmatrix}   (1)
Step 2: The fitness value is evaluated by Eq. (2):

F_X = \begin{bmatrix} f(x_{1,1}, x_{1,2}, \ldots, x_{1,D}) \\ f(x_{2,1}, x_{2,2}, \ldots, x_{2,D}) \\ \vdots \\ f(x_{N,1}, x_{N,2}, \ldots, x_{N,D}) \end{bmatrix}   (2)

The sparrows with the best fitness values are selected as the discoverers to lead the whole sparrow population closer to the food source. The location of the discoverers is updated by Eq. (3):

X_{i,j}^{k+1} = \begin{cases} X_{i,j}^{k} \cdot \exp\left(\dfrac{-i}{\alpha \cdot max\_iteration}\right), & R_2 < ST \\ X_{i,j}^{k} + Q \cdot L, & R_2 \ge ST \end{cases}   (3)

where k is the number of the current iteration, max_iteration is the maximum number of iterations, α is a uniform random number within (0,1], Q is a random number obeying a normal distribution, L is a 1×D matrix in which each element is 1, R2 is a random alarm factor within [0,1], and ST is a safety factor within [0.5,1], respectively.

Step 3: The rest of the sparrows, i.e., those not selected as discoverers, act as participants, and their locations are updated by Eq. (4):

X_{i,j}^{k+1} = \begin{cases} Q \cdot \exp\left(\dfrac{X_{worst}^{k} - X_{i,j}^{k}}{i^{2}}\right), & i > N/2 \\ X_{P}^{k+1} + \left| X_{i,j}^{k} - X_{P}^{k+1} \right| \cdot A^{+} \cdot L, & \text{otherwise} \end{cases}   (4)

where X_{worst}^{k} is the worst position in the whole search space, X_{P}^{k+1} is the current best position occupied by the discoverers, A is a 1×D matrix in which each element is randomly assigned the value 1 or -1, and A^{+} = A^{T}(AA^{T})^{-1}.

Step 4: When the sparrows start to forage, 10%-20% of them are selected to be on guard. When they detect approaching danger, both the discoverers and the participants give up the current food and fly to another location. Based on this alarming behavior, the location of the sparrows is updated by Eq. (5):

X_{i,j}^{k+1} = \begin{cases} X_{best}^{k} + \beta \cdot \left| X_{i,j}^{k} - X_{best}^{k} \right|, & f_i > f_g \\ X_{i,j}^{k} + \gamma \cdot \dfrac{\left| X_{i,j}^{k} - X_{worst}^{k} \right|}{(f_i - f_w) + \varepsilon}, & f_i = f_g \end{cases}   (5)

where β is a step size control factor drawn from a normal distribution with mean 0 and variance 1, γ is a random step size factor within [-1,1] that represents the movement direction of the sparrows, ε is a minimum constant used to avoid a zero denominator, f_i is the fitness value of the i-th sparrow, and f_g and f_w are the current global best and worst fitness values, respectively. The pseudo code of SSA is given in Algorithm 1, and the flowchart of the SSA algorithm is shown in Fig. 1.

Algorithm 1 Pseudo code of SSA
Set the parameters of SSA: the population size N, the number of discoverers PD, the number SD of sparrows to be on guard, the safety factor ST, and max_iteration.
Initialize the population X.
While (k < max_iteration)
  Calculate the fitness value F by Eq. (2), and obtain the best solution fbest and worst solution fworst by sort(F).
  Update R2.
  for i = 1 : PD
    Update the location of the discoverers by Eq. (3).
  End for
  for i = PD + 1 : N
    Update the location of the participants by Eq. (4).
  End for
  for i = 1 : SD
    Update the location of the sparrows on guard by Eq. (5).
  End for
  Update the best solution and position.
  k = k + 1;
End while
Return fbest and Xbest
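To make the update rules concrete, the following Python sketch implements one SSA iteration under the interpretation of Eqs. (3)-(5) given above. The function name ssa_step, the use of the best sorted individual as X_P, and the boundary clipping are illustrative assumptions, not the reference implementation.

import numpy as np

def ssa_step(X, fitness, k, max_iteration, PD, SD, ST, lb, ub, rng):
    """One iteration of the basic SSA (a minimal sketch of Eqs. (3)-(5))."""
    N, D = X.shape
    order = np.argsort(fitness)                  # ascending: best sparrow first
    X, fitness = X[order].copy(), fitness[order]
    best, worst = X[0].copy(), X[-1].copy()
    f_best, f_worst = fitness[0], fitness[-1]

    R2 = rng.random()                            # random alarm factor in [0, 1]
    # Discoverers, Eq. (3)
    for i in range(PD):
        if R2 < ST:
            alpha = 1.0 - rng.random()           # uniform in (0, 1]
            X[i] = X[i] * np.exp(-(i + 1) / (alpha * max_iteration))
        else:
            X[i] = X[i] + rng.normal()           # Q * L with L = ones(1, D)
    # Participants, Eq. (4); X_P approximated by the current best individual
    for i in range(PD, N):
        if i > N / 2:
            X[i] = rng.normal() * np.exp((worst - X[i]) / (i + 1) ** 2)
        else:
            A = rng.choice([-1.0, 1.0], size=D)
            # |X_i - X_P| * A^+ * L reduces to a scalar step since A^+ = A^T / D
            step = np.abs(X[i] - best) @ A / D
            X[i] = best + step
    # Sparrows on guard, Eq. (5)
    for i in rng.choice(N, size=SD, replace=False):
        if fitness[i] > f_best:
            X[i] = best + rng.normal() * np.abs(X[i] - best)
        else:
            K = rng.uniform(-1.0, 1.0)           # gamma in [-1, 1]
            X[i] = X[i] + K * np.abs(X[i] - worst) / (fitness[i] - f_worst + 1e-50)
    return np.clip(X, lb, ub)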

III. THE PROPOSED METHODOLOGY
For low-dimensional unimodal and multimodal optimization problems, SSA features a fast convergence speed and good global convergence ability. However, for some complex problems, especially high-dimensional and multimodal ones, SSA falls into local optima more easily. The performance of the basic SSA depends on the interaction between individual sparrows, and when most sparrows are trapped in the same local optimum, SSA slows down and eventually stagnates.
In order to effectively overcome the shortcomings of SSA on complex optimization problems, an effective method called the mutation strategy is introduced into SSA. In this strategy, four mechanisms are embedded into the basic SSA: the improved Tent chaos map, Lévy flights, elite opposition-based learning and variable radius perturbation. Hybrid mutation strategies built from them in a variety of ways are introduced to mutate the population of the basic SSA and enhance its performance; they improve the ability to move quickly toward the best solution and enrich population diversity. In this work, several mutation mechanisms are introduced into the basic SSA. The first, second, third and fourth variants exploit the improved Tent chaos map, Lévy flights, elite opposition-based learning and variable radius perturbation, respectively. The fifth variant uses the combination of all four mechanisms in three different ways. The detailed variants are as follows.

A. IMPROVED TENT CHAOS-SSA (ITSSA)
Like other traditional metaheuristic algorithms, the basic SSA tends to fall into local optima when population diversity is weak. Therefore, in this section, the population of SSA is initialized by the improved Tent chaos map to enhance population diversity.

(I) Initializing the population by the improved Tent chaos map

The Tent chaos map has a great influence on the performance of an optimization algorithm [100] and has the advantages of uniform ergodicity and fast search speed [101]. However, the Tent chaos map also suffers from small periods and unstable periodic points. Therefore, to avoid falling into a small period or an unstable periodic point, the Tent chaos map is improved by adding the perturbation term rand(0,1) × (1/N), as shown in Eq. (6) [102]:

z_{i+1} = \begin{cases} 2 z_i + \text{rand}(0,1) \times \dfrac{1}{N}, & 0 \le z_i \le 0.5 \\ 2(1 - z_i) + \text{rand}(0,1) \times \dfrac{1}{N}, & 0.5 < z_i \le 1 \end{cases}   (6)

where N is the population size.
Based on improved Tent chaos strategy, the detailed steps to initialize the population of SSA are as follows.
Step 1: Randomly generate the initial chaotic variable z_0 within (0,1).
Step 2: Set the maximum number of iterations to max_iteration and, according to Eq. (6), iterate for i steps to obtain the chaotic sequence.
Step 3: If i < max_iteration, save the chaotic sequence z_i; otherwise, stop the iteration.
The chaotic sequence is then mapped into the search space by Eq. (7):

X_{i,j} = lb_j + z_{i,j} \cdot (ub_j - lb_j)   (7)

where lb_j and ub_j are the lower and upper bounds of the j-th variable. Therefore, in ITSSA, Eq. (1) of SSA is replaced by Eqs. (6) and (7) to increase population diversity.
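The initialization of Eqs. (6)-(7) can be sketched in Python as follows; tent_init is a hypothetical helper name, and wrapping the chaotic variable back into (0,1) is an assumption made for numerical safety.

import numpy as np

def tent_init(N, D, lb, ub, rng):
    """Population initialization with the improved Tent map (sketch of Eqs. (6)-(7))."""
    Z = np.empty((N, D))
    z = rng.random(D)                          # initial chaotic variables in (0, 1)
    for i in range(N):
        z = np.where(z < 0.5,
                     2 * z + rng.random(D) / N,          # improved Tent branch 1
                     2 * (1 - z) + rng.random(D) / N)    # improved Tent branch 2
        z = np.mod(z, 1.0)                     # keep the sequence inside (0, 1)
        Z[i] = z
    return lb + Z * (ub - lb)                  # Eq. (7): map chaos values to the search space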

(II) Optimal individual perturbation by improved Tent chaos
When all sparrows have found the current optimal solution, the optimal sparrow individual is mutated with the improved Tent chaos under a random roulette strategy to improve the global convergence accuracy. Therefore, in ITSSA, the optimal sparrow individual is mutated by Eqs. (8) and (9).
The chaotic perturbation term in Eq. (9) is calculated by Eqs. (6) and (7). Thus, the pseudo code of ITSSA is given in Algorithm 2 as follows.

Algorithm 2 Pseudo code of ITSSA
Set the parameters of ITSSA: the population size N, the number of discoverers PD, the number SD of sparrows to be on guard, the safety factor ST, and max_iteration.
Initialize the population X by Eqs. (6) and (7).
While (k < max_iteration)
  Calculate the fitness value F by Eq. (2), and obtain the best solution fbest and worst solution fworst by sort(F).
  Update the locations of the discoverers, participants and sparrows on guard by Eqs. (3), (4) and (5), respectively.
  Update the best solution and position.
  Calculate r by Eq. (8).
  if (rand < r)
    Mutate the optimal sparrow individual by Eq. (9).
  End if
  Update the best solution and position.
  k = k + 1;
End while
Return fbest and Xbest

B. LÉVY FLIGHTS-SSA (LFSSA)
In this section, in order to enlarge the search space and improve the ability to avoid falling into local optima, the Lévy flights mutation and an inertia weighting factor are introduced into the basic SSA. In this way, LFSSA can perform the global search more effectively. The Lévy step can be generated by Eqs. (10)-(12) [103]:

\text{Lévy} \sim s^{-1-\beta}, \quad 0 < \beta \le 2   (10)

s = \dfrac{\gamma \cdot u}{|v|^{1/\beta}}   (11)

\gamma = \left( \dfrac{\Gamma(1+\beta)\sin(\pi\beta/2)}{\Gamma\left(\frac{1+\beta}{2}\right)\,\beta \cdot 2^{(\beta-1)/2}} \right)^{1/\beta}   (12)

where u and v are both drawn from the standard normal distribution, β = 1.5, and γ is expressed by Eq. (12).
The inertia weighting factor is expressed by Eq. (13):

w = 1 - k / max\_iteration   (13)

Therefore, based on the random roulette strategy, the individual positions of the sparrows are mutated by Eq. (14), and the optimal sparrow individual is then mutated by Eq. (15), where Lévy(D) is a random number drawn from the Lévy distribution by Eqs. (10)-(12). Thus, the pseudo code of LFSSA is given in Algorithm 3 as follows.
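A Python sketch of the Lévy step via Mantegna's algorithm follows; levy and levy_mutate are hypothetical helper names, and the exact multiplicative form of Eq. (14) is an assumption.

import numpy as np
from math import gamma, pi, sin

def levy(D, rng, beta=1.5):
    """Lévy step via Mantegna's algorithm (a sketch of Eqs. (10)-(12))."""
    gamma_ = (gamma(1 + beta) * sin(pi * beta / 2) /
              (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(size=D)                     # u, v: standard normal draws
    v = rng.normal(size=D)
    return gamma_ * u / np.abs(v) ** (1 / beta)

def levy_mutate(x, k, max_iteration, rng):
    """Mutation in the spirit of Eq. (14): the inertia weight w of Eq. (13)
    shrinks the Lévy step as the iterations proceed."""
    w = 1 - k / max_iteration                  # Eq. (13)
    return x + w * levy(x.size, rng) * x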

Algorithm 3 Pseudo code of LFSSA
Set the parameters of LFSSA: the population size N, the number of discoverers PD, the number SD of sparrows to be on guard, the safety factor ST, and max_iteration.
Initialize the population X by Eq. (1).
While (k < max_iteration)
  Calculate the fitness value F by Eq. (2), and obtain the best solution fbest and worst solution fworst by sort(F).
  Update the locations of the discoverers, participants and sparrows on guard by Eqs. (3), (4) and (5), respectively.
  for i = 1 : N
    if (rand < r)
      Mutate the location of the i-th sparrow by Eq. (14).
    End if
  End for
  Update the best solution and position.
  if (rand < r)
    Mutate the optimal sparrow individual by Eq. (15).
  End if
  Update the best solution and position.
  k = k + 1;
End while
Return fbest and Xbest

C. ELITE OPPOSITION-BASED LEARNING-SSA (EOBLSSA)
In this section, the opposition-based learning mutation is used to enlarge the search space of the population, improve the ability to avoid premature convergence to local optima, and increase the convergence speed. The individuals with the best fitness are regarded as elite individuals, which contain more useful information to guide the population toward the global optimum. If the algorithm is to reach the global optimum, the search space of the global optimum will inevitably overlap the search space of the elite individuals. Hence, strengthening the search in the neighborhood of the elite individuals can improve the convergence speed and accuracy of the algorithm. In addition, to improve population diversity, the population of SSA is initialized with the elite opposition-based learning mutation, and when SSA falls into a local optimum, it is perturbed by the elite opposition-based learning mutation. The EOBLSSA algorithm is proposed on the basis of this mutation. Some definitions of elite opposition-based learning are as follows [104],[105],[106]. For an elite individual X_e = (x_{e,1}, \ldots, x_{e,D}), the dynamic bounds of the j-th dimension are

da_j = \min_i (x_{i,j}), \quad db_j = \max_i (x_{i,j}), \quad i = 1, \ldots, n   (16)

and the elite opposite solution is defined as

\bar{x}_{e,j} = \eta \,(da_j + db_j) - x_{e,j}   (17)

where η is a random number within (0,1) and n is the total number of solutions. Opposite points that leave the dynamic interval are reset by random re-generation within it:

\bar{x}_{e,j} = \text{rand}(da_j, db_j), \quad \text{if } \bar{x}_{e,j} < da_j \text{ or } \bar{x}_{e,j} > db_j   (18)
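The EOBL mutation of Eqs. (16)-(18) can be sketched in Python as follows; elite_opposition is a hypothetical helper name, and greedy acceptance of better opposite points is left to the caller.

import numpy as np

def elite_opposition(X, fitness, n_elite, rng):
    """Elite opposition-based learning (sketch of Eqs. (16)-(18)): for each elite
    individual, the opposite point is eta*(da + db) - x over dynamic bounds."""
    elite_idx = np.argsort(fitness)[:n_elite]
    da, db = X.min(axis=0), X.max(axis=0)      # dynamic bounds per dimension, Eq. (16)
    eta = rng.random()                         # random coefficient in (0, 1)
    X_opp = eta * (da + db) - X[elite_idx]     # Eq. (17)
    # Eq. (18): reset opposite points that leave the dynamic interval
    out = (X_opp < da) | (X_opp > db)
    X_opp[out] = (da + (db - da) * rng.random(X_opp.shape))[out]
    return X_opp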

Algorithm 4 Pseudo code of EOBLSSA
Set the parameters of EOBLSSA: the population size N, the number of discoverers PD, the number SD of sparrows to be on guard, the safety factor ST, and max_iteration.
Initialize the population X by Eq. (1) together with the elite opposition-based solutions of Eqs. (16)-(18).
While (k < max_iteration)
  Calculate the fitness value F by Eq. (2), and obtain the best solution fbest and worst solution fworst by sort(F).
  Update the locations of the discoverers, participants and sparrows on guard by Eqs. (3), (4) and (5), respectively.
  for each elite individual
    Generate the opposite solution by Eq. (17), with boundary handling by Eq. (18).
    if the opposite solution is better
      Update the elite solution and the location of the elite individual.
    End if
  End for
  Update the best solution and position.
  k = k + 1;
End while
Return fbest and Xbest

D. VARIABLE RADIUS PERTURBATION-SSA (VRSSA)
In this section, to enhance the ability to avoid falling into local optima and to improve the global convergence, a variable radius perturbation operator is introduced into SSA, as illustrated in Fig. 2. The variable radius R can be expressed by Eq. (19).
where k is the current iteration and max_iteration is the maximum number of iterations, respectively.
Therefore, based on the random roulette strategy, the individual position X_{i,j} of a sparrow is mutated by Eq. (20), where ub and lb are the upper and lower bounds of the variables, respectively.
In Fig. 2, with the current optimal solution as the reference point, R becomes smaller and smaller as the number of iterations increases, so the search space keeps shrinking. In the early iterations, both R and the search space are large, which helps improve the global search ability. In the later iterations, both R and the search space are relatively small, which helps the algorithm refine the solution and jump out of local optima.
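A Python sketch of the variable radius perturbation follows; the linearly shrinking radius is an assumption consistent with the description of Fig. 2 (the paper's exact expression in Eq. (19) may differ), and vr_mutate is a hypothetical helper name.

import numpy as np

def vr_mutate(x_best, k, max_iteration, lb, ub, rng):
    """Variable-radius perturbation around the current best (sketch of Eqs. (19)-(20))."""
    R = 1 - k / max_iteration                  # assumed form of Eq. (19): shrinks over time
    direction = rng.uniform(-1.0, 1.0, size=x_best.size)
    x_new = x_best + R * direction * (ub - lb) # perturb within the shrinking radius, Eq. (20)
    return np.clip(x_new, lb, ub)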
The VRSSA algorithm is proposed based on the variable radius perturbation operator, and the pseudo code of VRSSA is shown in Algorithm 5.

Algorithm 5 Pseudo code of VRSSA
Set the parameters of VRSSA: the population size N, the number of discoverers PD, the number SD of sparrows to be on guard, the safety factor ST, and max_iteration.
Initialize the population X by Eq. (1).
While (k < max_iteration)
  Calculate the fitness value F by Eq. (2), and obtain the best solution fbest and worst solution fworst by sort(F).
  Update the locations of the discoverers, participants and sparrows on guard by Eqs. (3), (4) and (5), respectively.
  Update the best solution and position.
  Calculate R by Eq. (19).
  if (rand < R)
    Mutate the optimal sparrow individual by Eq. (20).
  End if
  Update the best solution and position.
  k = k + 1;
End while
Return fbest and Xbest

E. COMBINED MUTATION-SSA (CMSSA)

A variety of mutation operators can improve the local and global search capability of the SSA algorithm to some extent. However, a single mutation operator is often unable to balance exploration and exploitation comprehensively. In order to further improve the performance of SSA, CMSSA is proposed as the combination of the improved Tent chaos map mutation operator, the Lévy flights mutation operator, the elite opposition-based learning mutation operator, and the variable radius perturbation mutation operator. Initially, the population of SSA is initialized by the improved Tent chaos map mutation operator to enrich population diversity. In addition, after the positions of all sparrows have been updated for the first time, a position-updating mode combining Lévy flights and elite opposition-based learning is introduced into SSA to improve the global search ability of the algorithm. Finally, the optimal sparrow individual is mutated with the combination of the variable radius perturbation operator and the improved Tent chaos perturbation operator under the random roulette strategy to improve the local search ability of the algorithm. In CMSSA, notably, the position update of the sparrows differs from that in LFSSA, as shown in Eq. (21).
Thus, the pseudo code and flowchart of CMSSA are described in Algorithm 6 and shown in Fig. 3, respectively.

Algorithm 6 Pseudo code of CMSSA
Set the parameters of CMSSA, such as the population size N, the number of discoverers PD, the number SD of sparrows to be on guard, the safety factor ST, max_iteration, the number ED of positions updated using elite opposition-based learning, and the number LD of positions updated using Lévy flights.
Initialize the population X by Eqs. (6) and (7).
While (k < max_iteration)
  Calculate the fitness value F by Eq. (2), and obtain the best solution fbest and worst solution fworst by sort(F).
  for i = 1 : N
    Update the location of the i-th sparrow by Eq. (3), (4) or (5) according to its role.
    if the new location is better
      Update the location of the i-th sparrow.
    End if
  End for
  Update the best solution and position.
  for m = 1 : ED
    Update the location of the elite individuals by Eq. (17).
    Update the elite solution fn by Eq. (18).
    if (fn < f(Xm))
      Update the elite solution and the location of the elite individuals.
    End if
  End for
  for l = 1 : LD
    Update the location of the remaining sparrows by Eq. (21).
    if (f(Xl') < f(Xl))
      Update the elite solution and the location of the sparrows.
    End if
  End for
  Update the fitness value and position.
  Calculate R by Eq. (19).
  if (rand < R)
    Mutate the optimal sparrow individual by Eq. (20).
  else
    Mutate the optimal sparrow individual by Eq. (9).
  End if
  Update the best solution and position.
  k = k + 1;
End while
Return fbest and Xbest
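Putting the pieces together, a compact Python sketch of the CMSSA mutation pipeline is given below, built from the illustrative helpers sketched earlier (levy_mutate, elite_opposition, vr_mutate) plus a hypothetical tent_perturb standing in for the Eq. (9) mutation; greedy acceptance checks are omitted for brevity.

def cmssa_mutations(X, fitness, best, k, max_iteration, ED, LD, lb, ub, rng):
    """One pass of the CMSSA mutation stages (a sketch of the loop body of Algorithm 6).
    Assumes X is sorted by fitness, best individual first."""
    # 1) Elite opposition-based learning on the ED best individuals (Eqs. (16)-(18))
    X[:ED] = elite_opposition(X, fitness, ED, rng)
    # 2) Lévy-flight update of the next LD individuals (in the spirit of Eq. (21))
    for i in range(ED, ED + LD):
        X[i] = levy_mutate(X[i], k, max_iteration, rng)
    # 3) Roulette between variable-radius and Tent perturbation of the best individual
    R = 1 - k / max_iteration                  # assumed form of Eq. (19)
    if rng.random() < R:
        best = vr_mutate(best, k, max_iteration, lb, ub, rng)   # Eq. (20)
    else:
        best = tent_perturb(best, lb, ub, rng)                  # Eq. (9), assumed helper
    return X, best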

F. COMPUTATIONAL COMPLEXITY ANALYSIS
In this part, the computational complexity of all the proposed SSA variants is analyzed. Each variant mainly consists of the following phases: initialization, fitness evaluation and sorting, and the sparrows' location update. In the following, N denotes the number of sparrows, D denotes the dimension of the functions, T denotes the maximum number of iterations, PD denotes the number of discoverers, SD denotes the number of sparrows on guard, ED denotes the number of sparrows updated by EOBL, and LD denotes the number of sparrows updated by LF. In SSA, the computational complexity of initialization is O(N × D); each iteration then costs O(N) for fitness evaluation, O(N log N) for sorting, and O(N × D) for the location update.
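Combining these phase costs over T iterations gives (a sketch with constants omitted; the extra mutation terms for CMSSA are inferred from the loop structure of Algorithm 6):

O(ND) + T\big( O(N) + O(N \log N) + O(ND) \big) = O\big( T N (D + \log N) \big)

and CMSSA adds roughly O\big( T (ED + LD) D \big) for its additional mutation operators, which leaves the asymptotic order unchanged since ED + LD ≤ N.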

IV. EXPERIMENTAL RESULTS
In this section, the proposed algorithms are tested on 31 well-known benchmark test functions [107], [108], [109] and the results are compared to other state-of-the-art algorithms. The benchmark test functions in Appendix A fall into four groups: unimodal functions (see Table 5), multimodal functions (see Table 6), fixed-dimension multimodal functions (see Table 7) and 8 hybrid and composition functions from CEC2017 (see Table 8), respectively. The 2D landscapes of a few benchmark functions are shown in Fig. 4. The experiments are mainly divided into four parts. In the first part, the performance of the proposed algorithms is tested on the 31 benchmark test functions against the basic SSA. In the second part, a scalability test of the best of the proposed algorithms is performed. In the third part, the performance of the best proposed algorithm is evaluated on the 31 benchmark test functions against other algorithms.
In the fourth part, to demonstrate its efficiency, the best proposed algorithm is also employed on eight constrained real-world optimization problems.

B. THE PERFORMANCE EVALUATION OF ALL THE PROPOSED VARIANTS COMPARED TO BASIC SSA
In this section, the parameter tuning of the proposed SSA variants is performed first. Then the performance of the proposed SSA variants is evaluated on the 31 benchmark test functions.

(I) Investigating the influence of parameters on SSA variants
Parameter tuning plays an essential role in the performance evaluation of metaheuristics. The SSA variants mainly involve five parameters, namely the maximum number of iterations, the number of search agents, PD, SD and ST. The sensitivity of these parameters has been analyzed by varying their values on the F1, F5, F12, F13 and F22 functions. All results in this section are obtained from 30 independent experiments.
(a) Maximum number of iterations: All proposed algorithms were run with different maximum numbers of iterations, set to 100, 500 and 1000, respectively. The obtained Avg. values are shown in Table 9. For all proposed algorithms, the results reveal that the Avg. values improve as the maximum number of iterations increases.
(b) Number of search agents: In order to evaluate the effect of the number of search agents on all test functions, it was set to 10, 30 and 50, respectively, for all proposed algorithms, and the obtained results are given in Table 10. The sensitivity of PD, SD and ST was analyzed in the same way (Tables 11-13). It is observed that ST=0.5 is a proper value for the ITSSA, LFSSA and EOBLSSA algorithms in many cases, ST=0.6 is suitable for the VRSSA algorithm, and ST=0.8 is an advisable value for the CMSSA algorithm on most of the test functions.

(II) The performance of all the proposed algorithms
In this section, to evaluate the performance of the proposed SSA variants more comprehensively, the performance of all proposed algorithms (i.e., ITSSA, LFSSA, EOBLSSA, VRSSA and CMSSA) is tested on the 31 benchmark test functions against the basic SSA algorithm. According to the parameter tuning results, the parameter settings of all proposed algorithms and SSA are given in Table 14. Table 15 reports the optimal values obtained by all proposed algorithms compared to the basic SSA algorithm. As can be seen from Table 15, the CMSSA algorithm achieves the best overall results; therefore, the CMSSA algorithm is selected for further study. In order to show the advantages of the CMSSA algorithm, convergence curves of ITSSA, LFSSA, EOBLSSA, VRSSA, CMSSA and SSA on some benchmark functions (i.e., F3, F5, F7, F9, F11, F13, F15, F22 and F23) are given in Fig. 5. As can be clearly seen from Fig. 5, the CMSSA algorithm ranks first in convergence rate on F3, F5, F7, F9, F11, F13, F15, F22 and F23, whereas the basic SSA algorithm ranks worst on F3, F5, F7, F9, F11, F13 and F15. On many benchmark functions, all the proposed SSA variants are superior to the basic SSA algorithm in terms of convergence rate, owing to the improvement brought by the various mutation operators. To sum up, these results show that the CMSSA algorithm is the best among the proposed variants.
In addition, the results of the Friedman test demonstrate that the CMSSA algorithm obtains the smallest average rank (Avg. index): CMSSA ranks first, followed by EOBLSSA, LFSSA, VRSSA, ITSSA and the basic SSA. Therefore, the CMSSA algorithm is the best method for handling the 31 benchmark functions.

C. THE SCALABILITY TEST FOR CMSSA COMPARED TO BASIC SSA
In this section, to compare the performance of CMSSA and the basic SSA further and more comprehensively, a scalability test is carried out. This test focuses on how different problem dimensions affect the relative performance of CMSSA and the basic SSA. The dimensions of F1-F13 are set to 30, 500 and 1000, respectively. The population size is set to 30 and the maximum number of iterations to 1000. The scalability results on the 13 benchmark functions are shown in Table 16.
As the dimension increases, it becomes more challenging to obtain the global optimal solution of the unimodal and multimodal benchmark functions. As can be seen from Table 16, when the dimension is set to 30, 500 and 1000, SSA is better than CMSSA on F8, and SSA is equal to CMSSA on F9, F10 and F11, respectively. However, CMSSA outperforms the basic SSA on the other benchmark functions when the dimension is set to 500 and 1000. The convergence curves of some benchmark functions with a dimension of 1000 are depicted in Fig. 6. As can be seen in Fig. 6, for different problem dimensions, and especially for high-dimensional problems, the convergence rate of CMSSA is faster than that of the basic SSA, and the ability of CMSSA to jump out of local optima is better. This may be because the combination of the improved Tent chaos, Lévy flights, elite opposition-based learning and variable radius perturbation mutation operators increases population diversity and the search space, and strengthens both the ability to escape local optima and the global search ability. All in all, it can be concluded that CMSSA is superior to the basic SSA across dimensions, especially for high-dimensional problems.

D. THE PERFORMANCE EVALUATION OF CMSSA
The above results show that CMSSA outperforms the other SSA variants. To evaluate the performance of CMSSA in terms of six aspects (i.e., exploitation capability, exploration capability, the ability to escape from local minima, convergence behavior, statistical testing and wall-clock time cost), in this section CMSSA is compared to several state-of-the-art and advanced meta-heuristic algorithms. All algorithms are run in the same experimental environment and all experiments comprise 30 independent runs. The competing algorithms are: PSO [7], CS [11], DA [16], GWO [12], WOA [17], MFO [15], SOA [23], SCA [43], MVO [44], SSA [30], EPO [21], STOA [24], TSA [29], SHO [19], RSO [31], TAPSO [112], MPSO [113], IPSO [114] and GWOCS [115]. The parameter settings of these algorithms are shown in Table 17.

(a) Analysis of exploitation capability (functions F1-F7)

The results show that CMSSA obtains the best solution, or at least the second-best solution, on all unimodal benchmark functions compared with the other meta-heuristic algorithms. Therefore, CMSSA is significantly competitive and has a very good exploitation capability.

(b) Analysis of exploration capability (functions F8-F23)
Compared with unimodal functions, multimodal functions (i.e., functions F8-F23) often have multiple local optima, so it is difficult to obtain the global optimum. Therefore, in this experiment, these functions were used to evaluate the exploration capability of CMSSA compared to PSO, CS, DA, GWO, WOA, MFO, SOA, SCA, MVO, EPO, TSA, STOA, SHO and RSO, respectively. The experimental results are shown in Table 18. CMSSA performs particularly well on F9, F11, F15, F16, F17, F18, F19 and F21. The results show that CMSSA is better than the other competitors on many multimodal functions, so CMSSA has a very good exploration capability. This is due to the mixture of various mutation operators, which improves the ability to search for the global optimum in CMSSA.

(c) Ability to escape from local minima (functions F24-F31)
To better evaluate the ability to escape from local minima, the hybrid and composition functions F24-F31 were tested. The results demonstrate that CMSSA is better than the other algorithms on many of these functions. Therefore, the CMSSA algorithm still has a very good ability to escape from local minima. This may be due to the variety of mutation operators used to mutate the sparrow individuals, which strengthens the ability of CMSSA to escape from local minima.

(d) Analysis of convergence behavior
In order to better evaluate the convergence rate of the CMSSA algorithm, convergence curves of PSO, CS, DA, GWO, WOA, MFO, SOA, SCA, MVO, EPO, TSA, STOA, SHO, RSO and CMSSA on 9 benchmark functions are provided in Fig. 7, and convergence curves of SCA, WOA, TAPSO, MPSO, IPSO, SSA, GWOCS and CMSSA on 6 hybrid and composition benchmark functions are provided in Fig. 8. As can be seen from Figs. 7 and 8, CMSSA has a faster convergence rate than the other algorithms in most cases, except for F5 and F24. The convergence behavior of CMSSA can be divided into three types. In the first type, the convergence rate of CMSSA gradually accelerates as the number of iterations increases, which is evident on F1, F10, F25 and F28. In the second type, the best solution is attained in the final iteration phase, which is evident on F7, F21 and F23. In the third type, convergence is rapid in the early iteration phase, which is evident on F27, F30 and F31.
These results show that CMSSA's ability to balance exploration and exploitation has been improved, especially for hybrid and composition problems.

(e) Analysis of wall-clock time cost

The wall-clock time costs are reported in Tables 20 and 21, respectively. As can be seen there, the time consumed by CMSSA is slightly longer than that of the basic SSA. This is due to the introduction of four mutation operators (i.e., the improved Tent chaos map, Lévy flights, elite opposition-based learning and variable radius perturbation mutation operators) in CMSSA, which enhance the balance between exploitation and exploration; the extra cost of CMSSA is therefore reasonable. Moreover, the wall-clock time of DA, SHO and MFO is longer than that of CMSSA on most of the 23 benchmark functions, and that of GWOCS is also longer than CMSSA on the 8 hybrid and composition benchmark functions. In general, CMSSA takes longer than many other algorithms; however, according to all the experimental results, CMSSA outperforms the other algorithms in most cases. Therefore, introducing a variety of mutation operators into the basic SSA to strengthen its performance is very worthwhile.

(f) Statistical testing
Apart from standard statistical measures such as the mean, median and standard deviation, the Wilcoxon statistical test at the 5% level of significance and the Friedman test are performed for statistically sound comparison. The p-values of the Wilcoxon statistical test are shown in Table 22. The statistical results of the Friedman test are tabulated in Tables 23 and 24, respectively. It is observed from Table 22 that the p-values obtained for CMSSA against the other algorithms are much smaller than 0.05 on most of the 31 benchmark functions. Tables 23 and 24 show that the average rank obtained by CMSSA is the smallest among all competitor approaches. Therefore, the results reveal that the proposed CMSSA is statistically different from the other competitor algorithms and outperforms them.
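The testing procedure can be illustrated with a short Python sketch; the data below are synthetic placeholders (in the paper, the samples would be the 30 recorded per-run results on one benchmark function), and the rank-sum variant of the Wilcoxon test is an assumption.

import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Hypothetical per-run best fitness values for two algorithms (30 runs each)
cmssa_runs = rng.normal(loc=1e-3, scale=5e-4, size=30)
ssa_runs = rng.normal(loc=1e-2, scale=5e-3, size=30)

stat, p = ranksums(cmssa_runs, ssa_runs)
print(f"Wilcoxon rank-sum p-value = {p:.3e}; significant at 5%: {p < 0.05}")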
In conclusion, the discussions and findings in this part clearly illustrate the exploitation and exploration capability, local optimum avoidance, convergence behavior, statistical testing and wall-clock time cost of the CMSSA algorithm. The performance of the CMSSA algorithm is better than that of the other algorithms in most cases, mainly owing to the various mutation operators introduced into SSA to balance exploitation and exploration. In the following section, the applicability of the CMSSA algorithm is evaluated on 8 real-world problems with complex constraints.

E. CMSSA FOR CONSTRAINED REAL-WORLD OPTIMIZATION PROBLEMS
In this section, to effectively evaluate the applicability of CMSSA in terms of constraint handling, eight well-known engineering problems (i.e., the gear train design problem, three-bar truss design problem, cantilever beam design problem, tension spring design problem, pressure vessel design problem, speed reducer problem, welded beam design problem and main girder design problem) are optimized by CMSSA. These problems have various constraints, so constraint handling methods are used to solve them. Different types of penalty function can be employed to deal with constrained problems [18]; a representative form is sketched below.
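As an illustration (one common static penalty form; the specific variant used for each problem follows [18]), a constrained problem min f(x) subject to g_i(x) ≤ 0 and h_j(x) = 0 can be converted into the unconstrained problem

F(\mathbf{x}) = f(\mathbf{x}) + \lambda \sum_{i=1}^{m} \left[ \max\left(0,\; g_i(\mathbf{x})\right) \right]^{2} + \mu \sum_{j=1}^{p} \left| h_j(\mathbf{x}) \right|^{2}

where λ and μ are large penalty coefficients; any constraint violation then inflates F(x), steering the search back toward the feasible region.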

(a) Gear train optimization design problem
According to Ref. [116], the optimal design of the gear train aims to find the gear transmission ratio that is as close as possible to 1/6.931. The structure of the compound gear train is shown in Fig. 9. TA, TB, TD and TF are the numbers of teeth on gears A, B, D and F, respectively. Because the number of teeth of each gear must be an integer between 12 and 60, the optimization problem becomes a constrained optimization problem with discrete variables. The optimization problem is expressed by Eq. (22).
As can be seen in Table 25, the CMSSA algorithm obtains four optimal solutions: the first at X = (49, 16, 19, 43), the second at X = (49, 19, 16, 43), the third at X = (43, 16, 19, 49) and the fourth at X = (43, 19, 16, 49), each with a corresponding fitness value equal to fmin = 0.14428097. By observing Fig. 10, CMSSA converges towards the best solution with low computational effort.
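The reported optimum can be verified arithmetically; assuming the transmission ratio takes the two middle teeth counts over the two outer ones (consistent with the symmetry of the four reported solutions):

\frac{16 \times 19}{49 \times 43} = \frac{304}{2107} \approx 0.14428097, \qquad \frac{1}{6.931} \approx 0.14427933

so the obtained ratio matches the target to within about 1.6 × 10^{-6}.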
The results of the CMSSA algorithm are superior to those of MIBBSQP, IDCNLP, SA, MVEP, Kannan BK, Gene AS, UPSO, CAPSO and BOA, and slightly inferior to those of GA, HSIA and CS. The results reveal that CMSSA is capable of effectively solving discrete problems as well.

(b) Three-bar truss optimization design problem
The objective of the three-bar truss design is to minimize its weight under complex and nonlinear constraints on stress, deformation and buckling [11]. The structure of the three-bar truss is depicted in Fig. 11. In this case, the design variables are the cross-sectional areas A1 (x1) and A2 (x2), and the mathematical optimization model is given by Eq. (23). This problem is solved by CMSSA and compared to CS [11], SSA [20], DEDS [126], MBA [127], PSO-DE [128] and TAS [136] from the literature. The convergence curve for the three-bar truss problem is shown in Fig. 12, and the comparison results are listed in Table 26. As can be seen in Table 26, the CMSSA algorithm provides the optimal solution at X = (0.788671835599963, 0.408258294610664) with a corresponding fitness value equal to fmin = 263.895910694497. The results obtained by the TAS method are infeasible since they violate one of the constraints.
The results indicate that the CMSSA algorithm attains very competitive results: its best solution is superior to that of CS, and it closely matches the best results of SSA, DEDS, MBA and PSO-DE. These results demonstrate that the CMSSA algorithm can also effectively solve nonlinear constrained problems.

(c) Cantilever beam optimization design problem

As shown in Fig. 13, there are 5 structural parameters (i.e., the side lengths of the square-shaped cross-sections Li (xi), i = 1, ..., 5) in the cantilever beam problem. The objective is to minimize the weight of the cantilever beam under vertical deformation constraints. The problem formulation is expressed by Eq. (24) [13]. The convergence curve of the cantilever beam design problem is presented in Fig. 14, and the problem is solved by CMSSA and compared with GOA [18], ALO [14], MMA [129], GCA(I) [129], GCA(II) [129], CS [11] and SOS [13] in Table 27. CMSSA obtains the optimal solution at X = (6.010729, 5.318938, 4.499154, 3.494689, 2.150293) with a corresponding fitness value equal to fmin = 1.33996. The results show that CMSSA outperforms MMA, GCA(I), GCA(II) and CS, and it is observed that the best solution of CMSSA is equal or close to those of GOA, SOS and ALO with different optimal design parameters. Therefore, the CMSSA algorithm has clear advantages in dealing with such problems.

(d) Tension spring optimization design problem
This problem is the tension spring design, and the objective is to minimize the fabrication cost [130]. Fig. 15 shows the structure. There are three design variables: the wire diameter (d), the mean coil diameter (D) and the number of active coils (N). Under some complex constraints, the mathematical formulation of this problem is given by Eq. (25). Solutions have previously been obtained with metaheuristic algorithms such as SSA [20], GSA [38], CPSO [130], ES [131], GA [132] and RO [133]. The convergence curve of the tension spring design problem is shown in Fig. 16, and the best solution of CMSSA is compared with those of all the above-mentioned algorithms in Table 28. CMSSA attains the optimal solution at X = (0.0520769, 0.3661089, 10.7606604) with a corresponding fitness value equal to fmin = 0.0126699.
These results show that CMSSA outperforms other methods when dealing with this problem and attains the best design with the lowest cost.

(e) Pressure vessel optimization design problem
The pressure vessel design problem has four parameters (i.e., the thickness of the spherical shell (Ts), the thickness of the ball head (Th), the radius of the spherical shell (R) and the length of the spherical shell (L)). The structure and parameters are shown in Fig. 17. The objective of this problem is to minimize the fabrication cost under several constraints. The mathematical model of this problem is given by Eq. (26). The pressure vessel design problem is optimized by CMSSA and the results are compared to SMA [28], WOA [17], HHO [26], SHO [19] and MCOA [134]. The convergence curve and comparison results of this problem are shown in Fig. 18 and Table 29, respectively. CMSSA provides the optimal solution at X = (0.778216, 0.384684, 40.323097, 199.954457) with a corresponding fitness value equal to fmin = 5885.4120443.
The results demonstrate that CMSSA finds the lowest-cost design compared to the other algorithms.

(f) Speed reducer optimization design problem
The speed reducer design problem has seven design variables, as shown in Fig. 19. The variables (x1-x7) represent the face width b (x1), module of teeth m (x2), number of teeth in the pinion z (x3), length of the first shaft between bearings l1 (x4), length of the second shaft between bearings l2 (x5), diameter of the first shaft d1 (x6), and diameter of the second shaft d2 (x7), respectively. The objective of this problem is to attain the minimum construction cost of the speed reducer under constraints on the bending stress of the gear teeth, the surface stress, the transverse deflections of the shafts and the stresses in the shafts. The mathematical model of this problem is given by Eq. (27). The comparison of the obtained optimal solution with various competitors (i.e., CS [11], SHO [19], STOA [24], TAS [136], SBS [137], PSO-DE [128] and Ray and Sain [138]) is shown in Table 30. According to Table 30, the results obtained by the TAS and Ray and Sain methods violate the constraints and are infeasible. It is observed that the CMSSA algorithm provides an optimal solution at X = (3.500081, 0.700032, 17, 7.323278, 7.737604, 3.350819, 5.286683) with a corresponding fitness value equal to fmin = 2995.564917. The results indicate that the proposed CMSSA algorithm obtains better results that outperform the other competitors (i.e., CS, SHO, STOA, SBS and PSO-DE). The convergence curve of the best optimal solution obtained by the CMSSA algorithm is shown in Fig. 20.

(g) Welded beam optimization design problem
The main objective of the welded beam problem is to minimize the fabrication cost. A simplified model of the welded beam design is depicted in Fig. 21. The four design variables of this problem are the thickness of the weld (h), the length of the clamped bar (l), the height of the bar (t) and the thickness of the bar (b), respectively. The problem is subject to constraints on the shear stress in the beam, the bending stress in the beam, the buckling load on the beam and the end deflection of the beam. The mathematical model of this problem is given by Eq. (28). The comparison of the obtained optimal solution with various competitors (i.e., MFO [15], WOA [17], RO [133], SHO [19], STOA [24], SOA [23], SSA [20], MVO [44] and GWO [12]) is shown in Table 31. As can be seen in Table 31, the CMSSA algorithm attains the optimal solution at X = (0.205410, 3.258999, 9.036343, 0.2057659) with a corresponding fitness value equal to fmin = 1.695799. The results show that the CMSSA algorithm is able to find the best optimal design compared to the other algorithms (i.e., MFO, WOA, RO, SHO, STOA, SOA, SSA, MVO and GWO). By observing Fig. 22, the CMSSA algorithm obtains a near-optimal solution early in the iteration process.

(h) Main girder optimization design problem
The lightweight design of the main girder of a bridge crane should meet requirements on strength, stiffness, stability, etc. The simplified structure of the main girder is shown in Fig. 23. The four design variables of this problem are the height of the main girder (d1), the width of the main girder (d2), the thickness of the web plate (d3) and the thickness of the flange plate (d4), respectively. The mathematical model of this problem is given by Eq. (29). The comparison of the obtained optimal solution with other competitors (i.e., CGA [139], GA-AN2 [140] and the normal way [141]) is shown in Table 32. As can be seen in Table 32, the CMSSA algorithm attains the optimal solution at X = (747.6007, 350, 5.000, 5.8333) with a corresponding fitness value equal to fmin = 5779.6702. The results show that the CMSSA algorithm finds the best design compared to the other approaches (i.e., CGA, GA-AN2 and the normal way). By observing Fig. 24, the CMSSA algorithm obtains a near-optimal solution with low computational cost.
To sum up, the results of the eight real-world engineering problems indicate that the CMSSA algorithm performs strongly on various challenging problems. The optimization results show that the CMSSA algorithm has a good capability to handle different constrained optimization problems. Thus, the CMSSA algorithm is an excellent optimizer that provides good optimization results with low computational cost and a fast convergence rate.

V. CONCLUSION AND FUTURE WORK
To further improve the performance of the SSA algorithm, this paper presents a new series of SSA variants, namely ITSSA, LFSSA, EOBLSSA, VRSSA and CMSSA. All the proposed algorithms are tested on a set of thirty-one benchmark functions to evaluate their exploration and exploitation phases and their ability to avoid local optima. The results reveal that the CMSSA algorithm is the best among these variants.
Moreover, the CMSSA algorithm is compared to 19 well-known optimization algorithms on the thirty-one benchmark functions to analyze its exploration, exploitation, local optima avoidance, convergence behavior and time cost. The results on these test functions show that the CMSSA algorithm is the best optimizer, providing very competitive results compared to the other optimizers. Statistical testing has been carried out to demonstrate the superiority of the CMSSA algorithm over the other metaheuristics. In addition, the CMSSA algorithm has been applied to eight real-world constrained engineering design problems (i.e., gear train, three-bar truss, cantilever beam, tension spring, pressure vessel, speed reducer, welded beam and main girder), which demonstrates that the CMSSA algorithm performs well in unknown search spaces. This paper also puts forward several research directions: the CMSSA algorithm may be applied to multi-objective optimization problems in future work, and binary and multi-objective versions of the CMSSA algorithm are an interesting direction for future contributions.