Article

A Variable Step Crow Search Algorithm and Its Application in Function Problems

Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, School of Mechanical and Power Engineering, Harbin University of Science and Technology, Harbin 150080, China
* Author to whom correspondence should be addressed.
Biomimetics 2023, 8(5), 395; https://doi.org/10.3390/biomimetics8050395
Submission received: 14 July 2023 / Revised: 16 August 2023 / Accepted: 24 August 2023 / Published: 28 August 2023

Abstract

Optimization algorithms are widely used to solve problems in many fields and are inspired by natural principles, animal living habits, plant pollination, chemistry principles, and physics principles. An optimization algorithm's performance directly impacts solving accuracy. The Crow Search Algorithm (CSA) is a simple and efficient algorithm inspired by the natural behaviors of crows. However, the flight length of CSA is a fixed value, which makes the algorithm fall into local optima and severely limits its solving ability. To solve this problem, this paper proposes a Variable Step Crow Search Algorithm (VSCSA). The proposed algorithm uses the cosine function to enhance CSA's searching ability, which greatly improves both the solution quality of the population and the convergence speed. In the update phase, VSCSA increases population diversity and enhances the global searching ability of the basic CSA. The experiments used 14 test functions, the CEC2017 functions, and engineering application problems to compare VSCSA with different algorithms. The experimental results show that VSCSA performs better in fitness values, iteration curves, box plots, searching paths, and Wilcoxon test results, which indicates that VSCSA is strongly competitive and clearly superior. VSCSA performs outstandingly on various test functions, and its searching accuracy is greatly improved.

1. Introduction

Optimization means using existing solutions and parameters to produce a satisfactory answer to a given problem. People have long conducted extensive research on various optimization problems. Newton and Leibniz founded calculus, which can solve some optimization problems. Since then, different mathematical methods have been proposed, such as the steepest descent method and the linear programming solution method, which can be used in many fields [1,2,3].
For specific problems, traditional approaches have produced specific optimization methods. However, most of these methods place special requirements on the searching space, requiring objective functions to be convex and continuously differentiable. These weaknesses limit traditional optimization methods in solving many practical problems [4,5,6,7]. Practical production problems are large-scale, non-linear, non-convex, and multi-extremal, with multiple constraints, making them difficult for traditional optimization methods to model mathematically. Therefore, exploring information processing methods with intelligent features is valuable.
In practical applications, intelligent algorithms generally require neither problem-specific information nor constraints on the problem, and they do not need the objective function to be continuous, differentiable, convex, or analytically expressible. Intelligent algorithms adapt well to uncertain data in the calculation process. At present, intelligent algorithms mainly include the African Vultures Optimization Algorithm (AVOA) [8], Beluga Whale Optimization (BWO) [9], Whale Optimization Algorithm (WOA) [10], Flow Direction Algorithm (FDA) [11], Grey Wolf Optimizer (GWO) [12], Harris Hawks Optimizer (HHO) [13], Sine Cosine Algorithm (SCA) [14], Spotted Hyena Optimizer (SHO) [15], Slime Mould Algorithm (SMA) [16], Symbiotic Organisms Search (SOS) [17], Wild Horse Optimizer (WHO) [18], Geometric Mean Optimizer (GMO) [19], Golden Jackal Optimization algorithm (GJO) [20], Coati Optimization Algorithm (COA) [21], Dandelion Optimizer (DO) [22], Remora Optimization Algorithm (ROA) [23], Great Wall Construction Algorithm (GWCA) [24], Generalized Normal Distribution Optimization (GNDO) [25], Pelican Optimization Algorithm (POA) [26], and so on [27,28,29,30].
These algorithms have achieved success in various engineering fields [31,32,33,34,35,36]. For large-scale optimization problems, intelligent algorithms are significantly superior to traditional mathematical programming methods in terms of computation time and complexity.
The Crow Search Algorithm (CSA) was proposed by Alireza Askarzadeh in 2016 [37]. Crows hide their food and remember its hiding location for several months. At the same time, they track other crows to steal food. The crow search algorithm was proposed based on these living habits of crows in nature. From the algorithmic perspective, the overall flying area of the crow population is the searching space. The position of each crow represents a feasible solution, and the location of a crow's hidden food corresponds to the objective function value. The best food position is the optimal solution in the searching space.
Shalini Shekhawat and Akash Saxena designed the Intelligent Crow Search Algorithm (ICSA) and used ICSA for the structural design problem, the frequency wave synthesis problem, and Model Order Reduction [38]. Yilin Chen et al. introduced a robust adaptive hierarchical learning Crow Search Algorithm for feature selection [39]. Primitivo Díaz et al. introduced an improved Crow Search Algorithm applied to energy problems [40]. Amrit Kaur Bhullar et al. proposed an enhanced crow search algorithm for AVR optimization [41]. Thippa Reddy Gadekallu et al. used CNN-CNS for hand gesture classification [42]. Malik Braik et al. designed a hybrid crow search algorithm for solving numerical and constrained global optimization problems [43]. Behrouz Samieiyan et al. applied the Promoted Crow Search Algorithm (PCSA) to solve dimension reduction problems [44]. Qingbiao Guo et al. used an improved crow search algorithm for parameter inversion of the probability integral method [45]. CSA has thus been applied in many fields.
In the basic CSA, crows update their positions using a fixed flight length in the searching space; this fixed flight length can make an individual jump out of a promising solution region, causing low searching accuracy. As a result, this paper proposes a variable step crow search algorithm (VSCSA). VSCSA uses cosine-function steps to update positions. The rest of this paper is organized as follows: Section 2 describes the basic CSA. Section 3 proposes the VSCSA. Section 4 analyzes the function experiment results. Section 5 analyzes the CEC2017 function experiment results. Section 6 presents engineering application problems. Section 7 gives the conclusion.

2. Crow Search Algorithm

The crow is the general name for passerine birds of the genus Corvus: large songbirds with sturdy bills and feet. Their nostrils are circular and usually covered by bristle-like feathers. Crows like to live in groups and have a strong clustering instinct. They are forest and grassland birds with a steady gait. Except for a few species, they often gather and nest in groups, and in autumn and winter they wander in mixed flocks, flying and calling. They are generally fierce and aggressive. CSA is a metaheuristic algorithm based on these intelligent crow behaviors. Crows steal food by observing where other birds hide theirs; a crow that has detected a thief will move its food to new hiding places to avoid becoming a future victim, and crows use their own experience to predict a pilferer's behavior. In CSA, the crows' overall flight area is the searching space, the position of each crow gives a feasible solution, and the crow's hidden food represents the quality of the function value.
The CSA steps are given in this section.
Step 1: Initialize the problem and adjustable parameters.
Set the CSA population size N, the maximum number of iterations itermax, the flight length fl, the awareness probability AP, and the searching dimension d. Crow i at iteration iter in the searching space is specified by a vector xi,iter (i = 1, …, N; iter = 1, …, itermax). The searching upper bound is ubi (i = 1, …, N) and the searching lower bound is lbi (i = 1, …, N).
Step 2: Initialize position and memory.
Each crow saves its hidden food location m during each iteration, which represents the best position the crow has found so far. During the initial iteration, the crows are inexperienced, so the initial memory, i.e., the location where each crow first hides its food, is set to its initial position.
Crows = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_d^1 \\ x_1^2 & x_2^2 & \cdots & x_d^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^N & x_2^N & \cdots & x_d^N \end{bmatrix}
Memory = \begin{bmatrix} m_1^1 & m_2^1 & \cdots & m_d^1 \\ m_1^2 & m_2^2 & \cdots & m_d^2 \\ \vdots & \vdots & \ddots & \vdots \\ m_1^N & m_2^N & \cdots & m_d^N \end{bmatrix}
Step 3: Evaluate the objective function.
Compute one crow position.
Step 4: Generate a new position.
Crow i will generate a new position. In this case, two states will happen:
State 1: Crow i will approach crow j.
State 2: Crow j will go to another position.
States 1 and 2 can be expressed as follows:
x_{i,iter+1} = \begin{cases} x_{i,iter} + r_i \times fl_{i,iter} \times (m_{j,iter} - x_{i,iter}), & r_{j,iter} \geq AP_{j,iter} \\ \text{a random position}, & \text{otherwise} \end{cases}
where r_i is a random number in the range [0, 1], fl_{i,iter} denotes the flight length of crow i at iteration iter, and AP denotes the awareness probability.
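The two states of Step 4 can be sketched in Python (a minimal illustration, not the authors' code; the bounds lb and ub stand in for the problem's actual search range):

```python
import numpy as np

def csa_update(x_i, m_j, fl, ap, lb, ub, rng):
    """One basic-CSA position update for crow i following crow j.

    x_i : current position of crow i (1-D array)
    m_j : memory (best position found so far) of crow j
    fl  : fixed flight length
    ap  : awareness probability
    """
    if rng.random() >= ap:
        # State 1: crow j does not notice; crow i moves toward m_j.
        r_i = rng.random()
        return x_i + r_i * fl * (m_j - x_i)
    # State 2: crow j notices; crow i goes to a random position.
    return lb + (ub - lb) * rng.random(x_i.shape)
```

Step 5 (the feasibility check) would then clip the result back into [lb, ub].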
Step 5: Check the feasibility of new positions.
Check the new position feasibility of each crow.
Step 6: Evaluate fitness functions of new positions.
Calculate all feasible solutions. The function value for the new position of each crow will be calculated.
Step 7: Update memory
The crows update their memory as follows:
m_{i,iter+1} = \begin{cases} x_{i,iter+1}, & f(x_{i,iter+1}) \text{ is better than } f(m_{i,iter}) \\ m_{i,iter}, & \text{otherwise} \end{cases}
Compare all fitness function values. If there is a better fitness function value of the new position, the memory will be updated.
Step 8: Check termination criterion.
Set iter = iter + 1. Stop if the termination criterion iter = itermax is met; otherwise, Steps 4–7 are repeated until itermax is reached.

3. Variable Step Crow Search Algorithm

In the original CSA, crows constantly update their positions in the searching space, but their flight length fl is fixed, while the solutions to a search problem are diverse. When a population is initialized, individuals have no prior exploration experience of the searching space to guide them, so they usually cannot directly locate the optimal solution or approach the region where it lies. The searching process should therefore proceed in multiple different directions to expand the searching scope and thereby increase the probability of approaching that region. In addition, individuals in the population should visit unexplored areas when exploring the searching space, thereby increasing the breadth of the search. As the CSA position update formula shows, the crow population updates its positions mainly by moving with a fixed flight length. Consequently, as iterations continue, the crow population gradually clusters and population diversity decreases, which can easily lead to a single searching direction and many local optima; this is not conducive to the algorithm's small-scale search in the later stage. To solve this problem, this article proposes a variable step crow search algorithm (VSCSA).
The cosine function is a periodic function with a minimum positive period of 2π. When the independent variable is 2kπ (k an integer), the function attains its maximum value of 1; when the independent variable is (2k + 1)π (k an integer), it attains its minimum value of −1. The cosine function is an even function, and its graph is symmetric about the y-axis. In VSCSA, when crow j does not detect that it is being followed, crow i replaces the fixed flight length with a cosine step:
xnew_{i,iter+1} = xnew_{i,iter} + |\cos(r_i)| \times (m_{j,iter} - xnew_{i,iter}), \quad r_{j,iter} \geq AP_{j,iter}
When crow j knows that crow i is following it, crow j will move to another position determined by the searching upper bound:
xnew_{i,iter+1} = 0.5 \times (ub_i \times rs_i + m_{j,iter})
New states 1 and 2 can be expressed as follows:
xnew_{i,iter+1} = \begin{cases} xnew_{i,iter} + |\cos(r_i)| \times (m_{j,iter} - xnew_{i,iter}), & r_{j,iter} \geq AP_{j,iter} \\ 0.5 \times (ub_i \times rs_i + m_{j,iter}), & \text{otherwise} \end{cases}
where rsi is a random number in the range of [−1, 1] and ubi(i = 1, …, N) is the searching upper bound.
VSCSA improves population diversity and the way search guidance changes during evolution. In the early searching stage, population diversity is relatively high, and the cosine steps guide population evolution, avoiding blind individual searching and rapid decay of population diversity; this meets the requirement that the algorithm explore on a large scale during the initial iterations. In the later searching stages, the proposed algorithm shifts from global exploration to local exploitation, which avoids divergence in the search directions. When the population falls into local optima, the proposed algorithm can use individuals generated by cosine steps as search guides to effectively increase population diversity and jump out of different local optimum areas. The proposed strategy therefore reflects an adaptive interaction between population diversity and the multiple search-guiding individuals: changes in the population's searching steps reflect different stages of evolution, and different guidance methods can be selected adaptively. In turn, different guidance methods alter the population's diversity, expand the algorithm's searching range, and strengthen its searching precision.
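Since r_i is drawn from [0, 1] (as in the basic CSA), the cosine step factor |cos(r_i)| always lies in [cos 1, 1] ≈ [0.5403, 1]: a large but varying step, in contrast to the fixed fl. A quick numerical check (a sketch, not the authors' code):

```python
import math

# Sample |cos(r)| for r on a fine grid over [0, 1]; cos is positive and
# decreasing there, so the step factor stays within [cos(1), 1].
steps = [abs(math.cos(k / 1000)) for k in range(1001)]
lo, hi = min(steps), max(steps)
print(round(lo, 4), round(hi, 4))  # 0.5403 1.0
```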
The VSCSA flowchart is presented in Figure 1.
The VSCSA pseudo code can be summarized in Algorithm 1.
Algorithm 1: VSCSA.
Input: Function f(.). Searching upper bound and lower bound. Set itermax. Set iter = 1. Population size N. Evaluate the position of the crows. Initialize the memory of each crow.
While (iter < itermax)
For i = 1:N
Randomly choose one of the crows to follow (for example j).
Define an awareness probability.
If 1 rj,iter ≥ APi,iter
    xnewi,iter+1 = xnewi,iter + |cos(ri)| × (mj,iter − xnewi,iter)
Else
    xnewi,iter+1 = 0.5 × (ubi × rsi + mj,iter)
End If 1
End For
Check the feasibility of new positions.
Evaluate the new positions of the crows.
Update the memory of the crows:
If 2 f(xnewi,iter+1) is better than f(mi,iter)
    mi,iter+1 = xnewi,iter+1
Else
    mi,iter+1 = mi,iter
End If 2
iter = iter + 1
End While
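Algorithm 1 translates into a compact Python sketch (a minimal illustration under stated assumptions: minimization, AP = 0.1, and clipping to [lb, ub] as the feasibility check; these choices are illustrative and not taken from the paper):

```python
import numpy as np

def vscsa(f, lb, ub, dim, n=20, iter_max=400, ap=0.1, seed=0):
    """Minimal VSCSA sketch following Algorithm 1 (minimization)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, size=(n, dim))   # initial positions
    mem = x.copy()                           # initial memory = initial position
    fit = np.apply_along_axis(f, 1, mem)
    for _ in range(iter_max):
        for i in range(n):
            j = rng.integers(n)              # crow to follow
            if rng.random() >= ap:
                # New state 1: variable |cos| step toward crow j's memory.
                x[i] = x[i] + np.abs(np.cos(rng.random())) * (mem[j] - x[i])
            else:
                # New state 2: move set by the upper bound and mem[j].
                rs = rng.uniform(-1.0, 1.0, size=dim)
                x[i] = 0.5 * (ub * rs + mem[j])
        x = np.clip(x, lb, ub)               # feasibility check
        new_fit = np.apply_along_axis(f, 1, x)
        better = new_fit < fit               # memory update rule
        mem[better] = x[better]
        fit[better] = new_fit[better]
    best = np.argmin(fit)
    return mem[best], fit[best]
```

For example, minimizing the sphere function `lambda v: float(np.sum(v * v))` over [−10, 10]² drives the best fitness close to zero within a few hundred iterations.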

4. Function Experiment Results

4.1. Experiment Environments

Different functions are listed in Table 1. In Table 1, D is the searching dimension, fmin is the ideal function value, and Range is the searching scope. The optimal solutions of the high-dimensional testing functions in this paper are hidden in smooth, narrow parabolic valleys, with broad searching spaces, tall obstacles, and a large number of local minimum points. This paper uses different test functions to compare the performance of VSCSA against the standard CSA, the cuckoo search algorithm (CS) [46], the sine cosine algorithm (SCA) [14], and the moth-flame optimization algorithm (MFO) [47]. CS was proposed by Xin-She Yang and was inspired by the cuckoo's brood-parasitic incubation mechanism in nature; the cuckoo is similar in size to a pigeon but is slender, with a dark gray upper body. SCA, proposed by Seyedali Mirjalili in 2016, was inspired by the sine and cosine mathematical functions. MFO, proposed by Seyedali Mirjalili in 2015, was inspired by the moth navigation mechanism in nature called transverse orientation. In this chapter, the CS discovery probability was set to 0.25 and the step to 0.25. For SCA, a = 2, r2 = 2π, and r3 and r4 were selected in [0, 1]. In CSA, fl = 2. All algorithm parameters were selected from the original algorithm literature. The population size was 20, the maximum number of iterations was 400, and each algorithm was run 10 times in MATLAB (R2016b).

4.2. Data Results

In Table 2 and Table 3, Min, Max, Ave, and Var denote the minimum value, the maximum value, the average value, and the variance. Table 2 shows two-dimension function results, and Table 3 shows high-dimension function results. For two-dimension functions, VSCSA obtains the ideal function values on f2 to f5, f12(D=2), and f13(D=2), and obtains the ideal values of all evaluation indexes on f2 to f4. CSA obtains the ideal function values on f12(D=2) and f13(D=2). MFO obtains the ideal function values on f2 to f5, f12(D=2), and f13(D=2), and the ideal values of all evaluation indexes on f2, f3, and f5. SCA obtains the ideal function values on f2 to f4, f12(D=2), and f13(D=2), and the ideal values of all evaluation indexes on f2 to f4. The Min values of MFO on f10 and f14(D=2) are better than those of VSCSA, and the Min value of SCA on f14(D=2) is better than that of VSCSA. For high-dimension functions, the Min values of SCA on f11(D=30), f12(D=60), f13(D=60), and f13(D=200) are better than those of VSCSA, and the Min value of MFO on f12(D=30) is better than that of VSCSA. All other VSCSA results are better than those of the comparative algorithms. VSCSA ensures continuous evolution and has good convergence speed and optimization accuracy. Especially for multi-peak high-dimension functions with rotational characteristics, the proposed algorithm can better overcome the interference caused by local extreme points during solving, prevent premature convergence, ensure continuous population evolution, and ultimately achieve high optimization accuracy.

4.3. Iteration Results

This paper gives the algorithms' best iteration curves over 10 independent runs, as shown in Figure 2 and Figure 3. Among the two-dimension iteration curves, VSCSA converges fastest except on f10, f13(D=2), and f14(D=2). On f10, SCA converges fastest; on f13(D=2) and f14(D=2), MFO converges fastest and CSA second fastest. Among the high-dimension iteration curves, VSCSA converges fastest except on f11(D=30), f12(D=30), f12(D=60), and f13(D=60), where SCA is fastest. VSCSA performs outstandingly on various test functions, and its searching accuracy in particular is greatly improved. The iteration curves therefore show that VSCSA has strong searching performance.

4.4. Box Plot Results

A box plot is drawn by connecting the two quartiles and then the upper and lower edges, with the median in the middle of the box. The narrower the box plot, the more concentrated the data. The algorithms' box plots are shown in Figure 4 and Figure 5. Among the low-dimension box plots, VSCSA has the narrowest box plot except on f8 and f13, where CSA's is narrowest. Among the high-dimension box plots, VSCSA has the narrowest box plot except on f11(D=60), f13(D=60), f11(D=200), and f13(D=200), where CSA's is narrowest. Compared with the standard CSA, VSCSA not only has higher solving accuracy but also runs faster on most testing functions, which fully demonstrates that VSCSA retains outstanding local search ability while significantly improving global searching performance.

4.5. Sub-Sequence Runs Results

In a radar chart, different axes are projected at equal angular intervals from the same center point, each axis represents a quantitative variable, and the points on the axes are connected into lines or polygons. Each variable has its own axis, the axes are equally spaced, and all axes share the same scale; the chart is equivalent to a parallel-coordinates plot arranged radially. This section shows the basic statistical assessment obtained in sub-sequence runs of the different algorithms. Ten sub-sequence runs are shown in Figure 6 and Figure 7. The longer the total edge length of a colored polygon, the lower the accuracy of that algorithm's sub-sequence runs. For the two-dimension amplification radar charts, the CS sub-sequences have the largest radar charts except on f12(D=2), where the MFO radar charts are larger than the CSA radar charts. For the high-dimension amplification radar charts, the CS sub-sequences have the largest radar charts except on f11(D=60) and f11(D=200) to f14(D=200).

4.6. Search Path Results

To examine the reliability, computational efficiency, and accuracy of the proposed algorithm, three-dimensional images of the two-dimension functions are given in Figure 8, while the VSCSA and CSA searching paths are projected onto a two-dimension plane in Figure 9. The red solid line is the VSCSA searching path, the green dashed line is the CSA searching path, and the pink dot is the theoretical optimal position. The CSA paths contain many short repeated searching segments and occasional long jumps. VSCSA performs strongly in maintaining population diversity, which supports finding the global optimum. In the early stage of the search, VSCSA quickly traverses and explores the entire solution region, locks in the approximate range of the global optimal solution, and preserves population diversity. In the final stage, the reduction of differences between individuals lets the searching process jump out of local vortices and find the ideal solution, which improves the algorithm's global convergence ability.

4.7. Wilcoxon Rank Sum Test Results

When testing algorithms, many different experimental results appear, so when comparing and analyzing algorithms, conclusions cannot be drawn solely from differences in a few results; statistical analysis should be conducted to test the significance of differences in the data. The Wilcoxon rank sum test returns a p-value. If the p-value is greater than 0.05, there is no significant difference between the two sets of data; if it is less than 0.05, the difference between the two algorithms' performances is significant. In Table 4, N means that the computer could not give a p-value because it was too large or too small. On functions f8, f13(D=2), f11(D=30), f13(D=30), f11(D=60), f13(D=60), f11(D=200), and f13(D=200), the p-value for CSA is larger than 0.05. On functions f4, f10, f11(D=2), f13(D=2), f12(D=30), and f13(D=30), the p-value for MFO is larger than 0.05. On functions f11(D=2), f12(D=2), f11(D=30), f13(D=30), and f13(D=60), the p-value for SCA is larger than 0.05. All other Wilcoxon rank sum test results are less than 0.05. From VSCSA's Wilcoxon rank sum test results, the searching accuracy of the algorithm has been significantly improved, and the improved algorithm is significantly better than the standard CSA in terms of searching accuracy and speed.
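For reference, the rank-sum p-value can be computed with a short self-contained routine. This is an illustrative sketch using the large-sample normal approximation; the paper's tests were presumably produced with a standard statistics routine (e.g. MATLAB's ranksum):

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation
    (reasonable for the ~10 independent runs per algorithm used here)."""
    pooled = sorted((v, g) for g, data in ((0, a), (1, b)) for v in data)
    n = len(pooled)
    ranks = [0.0] * n
    i = 0
    while i < n:                         # assign average ranks to ties
        j = i
        while j + 1 < n and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg_rank
        i = j + 1
    n1, n2 = len(a), len(b)
    w = sum(r for r, (_, g) in zip(ranks, pooled) if g == 0)
    mu = n1 * (n1 + n2 + 1) / 2          # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return 1.0 - math.erf(abs(z) / math.sqrt(2.0))  # = 2 * (1 - Phi(|z|))
```

Two clearly separated samples of 10 runs give p well below 0.05, while two interleaved samples give p well above 0.05, matching the decision rule used in Table 4.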

4.8. Algorithm Ranking Results

Algorithm ranking radar charts are shown in Figure 10. The positions of different colored dots in the radar image represent the algorithm searching accuracy. If the algorithm point is close to the center point, the algorithm has a high ranking. It can be seen that the VSCSA surrounds the center point. From the radar graph, it can be seen that VSCSA has the best results among multiple test functions and has the highest searching accuracy among comparison algorithms. Although VSCSA did not achieve comprehensive advantages in some test functions, it achieved optimal searching results in more than half of the test functions, indicating that VSCSA has strong competitiveness. It can be seen that the proposed VSCSA in this paper greatly enhances the CSA searching performance.

5. CEC2017 Test Function Experiment Results

5.1. Experiment Environments

The IEEE Congress on Evolutionary Computation (CEC) is one of the largest and most significant conferences in Evolutionary Computation (EC). The CEC test function suites are among the most widely used benchmarks for testing algorithms; CEC2017 is the test suite from the 2017 CEC conference. CEC2017 consists of different problem types: Unimodal, Multimodal, Hybrid, and Composition functions. To further evaluate the proposed algorithm, this paper selected functions F1 to F20 of CEC2017, given in Table 5. In Table 5, D is the searching dimension, Fmin is the ideal function value, and Range is the searching scope. F1 and F2 are Unimodal Functions, F3 to F9 are Simple Multimodal Functions, F10 to F19 are Hybrid Functions, and F20 is a Composition Function. The proposed method is compared with state-of-the-art (SOTA) algorithms from recent years: the bald eagle search algorithm (BES) [48], the COOT bird algorithm (COOT) [49], the wild horse optimizer (WHO) [18], and the whale optimization algorithm (WOA) [10]. All algorithm parameters were selected according to the original literature. The population size and the maximum number of iterations were 20 and 5000, respectively. To obtain a fair comparison, all algorithms were independently run 10 times in MATLAB (R2016b). The experimental environment was the Windows 7 operating system with an Intel(R) Core(TM) i3-7100 CPU and 8 GB RAM.

5.2. Experiment Results

The statistical results of the algorithms on the CEC2017 benchmark functions are shown in Table 6. In Table 6, Min, Max, and Var denote the minimum value, the maximum value, and the variance. For the Unimodal Functions, VSCSA, CSA, BES, COOT, and WHO obtain the ideal value on F1, and all six algorithms obtain the ideal value on F2. For the Simple Multimodal Functions, all six algorithms obtain the ideal value on F3 to F9. For the Hybrid Functions, all six algorithms obtain the ideal value on F10 and F11. CSA obtains the minimum value on F12, BES on F17, COOT on F14 and F16, and WHO on F13, F15, F18, and F19. For the Composition Function, COOT obtains the minimum value on F20.
Figure 11 gives the best iteration curves of different algorithms in 10 independent runs. From Figure 11 we can see that VSCSA has the fastest initial iteration speed in function F1, F3, and F9. And VSCSA has the fastest iteration speed in the later stage for function F2, F4 to F8, and F10. VSCSA has the slowest iteration speed in F12, F15, and F16.
Figure 12 gives box plots for different algorithms after 10 independent runs. VSCSA has the narrowest box plot in function F2 to F9. CSA has the narrowest box plot in function F12, F13, and F18. BES has the narrowest box plot in function F15. COOT has the narrowest box plot in function F10, F11, F14, F16, F17, F19, and F20. VSCSA has the worst box plot in function F1 and F15. BES has the worst box plot in function F5, F16, F17, and F20. COOT has the worst box plot in function F7. WHO has the worst box plot in function F8. WOA has the worst box plot in function F6, F11, F12, F13, F14, F18, and F19. For F10, VSCSA, BES, WHO, and WOA have large box plots.
Figure 13 gives radar charts for different algorithms after 10 independent runs. For Figure 13, VSCSA subsequences have the largest radar charts for function F15. BES has the largest radar charts for function F5 and F8. WHO has the largest radar charts for function F6. WOA has the largest radar charts for function F1 to F4, F9, F12, F14, F18, and F19. For function F7, F10, F11, F13, F16, F17, and F20, many algorithms have large radar charts.
Table 7 shows the Wilcoxon rank sum test results. In Table 7, N means that the computer could not give a p-value because it was too large or too small. On functions F7 and F11, the p-value for CSA is larger than 0.05. On functions F6, F8, F10, F11, F13, F16, F17, and F20, the p-value for BES is larger than 0.05. On functions F1, F3, F6, F8, F9, F11, F13, and F18, the p-value for COOT is larger than 0.05. On functions F2, F5, F6, F8, F10, F11, F14, and F17, the p-value for WHO is larger than 0.05. On functions F10, F12, F14, and F16 to F20, the p-value for WOA is larger than 0.05. All other Wilcoxon rank sum test results are less than 0.05.
Table 5. Basic information of CEC2017 benchmark functions.

| No. | Function | D | Range | Fmin |
| --- | --- | --- | --- | --- |
| F1 | Shifted and Rotated Bent Cigar Function | 2 | [−100, 100] | 100 |
| F2 | Shifted and Rotated Zakharov Function | 2 | [−100, 100] | 200 |
| F3 | Shifted and Rotated Rosenbrock’s Function | 2 | [−100, 100] | 300 |
| F4 | Shifted and Rotated Rastrigin’s Function | 2 | [−100, 100] | 400 |
| F5 | Shifted and Rotated Expanded Scaffer’s F6 Function | 2 | [−100, 100] | 500 |
| F6 | Shifted and Rotated Lunacek Bi-Rastrigin Function | 2 | [−100, 100] | 600 |
| F7 | Shifted and Rotated Non-Continuous Rastrigin’s Function | 2 | [−100, 100] | 700 |
| F8 | Shifted and Rotated Levy Function | 2 | [−100, 100] | 800 |
| F9 | Shifted and Rotated Schwefel’s Function | 2 | [−100, 100] | 900 |
| F10 | Hybrid Function 1 (N = 3) | 2 | [−100, 100] | 1000 |
| F11 | Hybrid Function 2 (N = 3) | 10 | [−100, 100] | 1100 |
| F12 | Hybrid Function 3 (N = 3) | 10 | [−100, 100] | 1200 |
| F13 | Hybrid Function 4 (N = 4) | 10 | [−100, 100] | 1300 |
| F14 | Hybrid Function 5 (N = 4) | 10 | [−100, 100] | 1400 |
| F15 | Hybrid Function 6 (N = 4) | 10 | [−100, 100] | 1500 |
| F16 | Hybrid Function 6 (N = 5) | 10 | [−100, 100] | 1600 |
| F17 | Hybrid Function 6 (N = 5) | 10 | [−100, 100] | 1700 |
| F18 | Hybrid Function 6 (N = 5) | 10 | [−100, 100] | 1800 |
| F19 | Hybrid Function 6 (N = 6) | 10 | [−100, 100] | 1900 |
| F20 | Composition Function 1 (N = 3) | 10 | [−100, 100] | 2000 |
Table 6. Comparison of results for CEC2017 benchmark functions.
Table 6. Comparison of results for CEC2017 benchmark functions.
Function | Metric | VSCSA | CSA | BES | COOT | WHO | WOA
F1 | Min | 100.0000 | 100.0000 | 100.0000 | 100.0000 | 100.0000 | 100.8089
F1 | Max | 100.4967 | 100.0000 | 100.0000 | 100.0079 | 2476.9326 | 4991.5872
F1 | Var | 0.0239 | 0 | 0 | 5.8721 × 10^−6 | 5.6498 × 10^5 | 3.0578 × 10^6
F2 | Min | 200.0000 | 200.0000 | 200.0000 | 200.0000 | 200.0000 | 200.0020
F2 | Max | 200.0000 | 200.0000 | 200.0000 | 200.0000 | 200.0012 | 200.0951
F2 | Var | 3.2226 × 10^−13 | 0 | 0 | 7.7463 × 10^−11 | 1.6905 × 10^−7 | 1.3605 × 10^−3
F3 | Min | 300.0000 | 300.0000 | 300.0000 | 300.0000 | 300.0000 | 300.0000
F3 | Max | 300.0000 | 300.0000 | 300.0000 | 300.0000 | 300.0000 | 300.0000
F3 | Var | 0 | 0 | 0 | 3.5902 × 10^−28 | 0 | 9.2862 × 10^−22
F4 | Min | 400.0000 | 400.0000 | 400.0000 | 400.0000 | 400.0000 | 400.0000
F4 | Max | 400.0000 | 400.0000 | 400.0000 | 400.0000 | 400.0000 | 400.0000
F4 | Var | 0 | 0 | 0 | 1.3381 × 10^−21 | 6.9650 × 10^−26 | 2.1903 × 10^−14
F5 | Min | 500.0000 | 500.0000 | 500.0000 | 500.0000 | 500.0000 | 500.0000
F5 | Max | 500.0000 | 500.0000 | 500.9950 | 500.0000 | 500.9950 | 500.9950
F5 | Var | 0 | 0 | 0.2640 | 0 | 0.1760 | 0.1760
F6 | Min | 600.0000 | 600.0000 | 600.0000 | 600.0000 | 600.0000 | 600.0000
F6 | Max | 600.0002 | 600.0000 | 600.0164 | 600.0000 | 600.1573 | 600.0976
F6 | Var | 5.1834 × 10^−9 | 0 | 2.6128 × 10^−5 | 8.0042 × 10^−11 | 0.0025 | 0.0009
F7 | Min | 700.0000 | 700.0000 | 702.0163 | 700.0000 | 700.9950 | 700.0000
F7 | Max | 700.9950 | 702.0163 | 702.2136 | 702.0163 | 704.7119 | 702.1708
F7 | Var | 0.0990 | 0.4066 | 0.0027 | 1.0842 | 0.8721 | 0.5292
F8 | Min | 800.0000 | 800.0000 | 800.0000 | 800.0000 | 800.0000 | 800.0000
F8 | Max | 800.0000 | 800.0000 | 804.9748 | 800.0000 | 800.9950 | 800.0000
F8 | Var | 0 | 0 | 2.4639 | 5.7443 × 10^−27 | 0.2310 | 4.8755 × 10^−24
F9 | Min | 900.0000 | 900.0000 | 900.0000 | 900.0000 | 900.0000 | 900.0000
F9 | Max | 900.0000 | 900.0000 | 900.0000 | 900.0000 | 900.0000 | 900.0000
F9 | Var | 0 | 0 | 0 | 5.7443 × 10^−27 | 0 | 8.2264 × 10^−14
F10 | Min | 1000.0000 | 1000.0000 | 1000.0000 | 1000.0000 | 1000.0000 | 1000.0000
F10 | Max | 1017.0694 | 1000.6243 | 1074.9496 | 1000.3122 | 1058.5045 | 1016.7572
F10 | Var | 71.9334 | 0.0444 | 465.5578 | 0.0097 | 337.6272 | 63.1675
F11 | Min | 1114.9624 | 1109.6715 | 1116.5732 | 1119.8084 | 1114.9368 | 1109.3504
F11 | Max | 1198.3403 | 1204.5114 | 1207.1855 | 1144.6353 | 1204.4832 | 1396.9177
F11 | Var | 663.0311 | 1057.8398 | 958.1988 | 70.7817 | 889.1115 | 1.0239 × 10^4
F12 | Min | 7.5856 × 10^4 | 2604.2573 | 3128.5674 | 1.4793 × 10^4 | 2.5114 × 10^3 | 1.9748 × 10^4
F12 | Max | 9.7061 × 10^5 | 3.6182 × 10^4 | 4.0349 × 10^4 | 5.7144 × 10^5 | 3.8765 × 10^4 | 1.0693 × 10^7
F12 | Var | 1.0431 × 10^11 | 1.1384 × 10^8 | 1.4756 × 10^8 | 4.7970 × 10^10 | 1.3494 × 10^8 | 1.48761 × 10^13
F13 | Min | 3041.5027 | 1403.7263 | 1455.2229 | 1895.3341 | 1318.7813 | 2105.0934
F13 | Max | 1.8156 × 10^4 | 2602.3091 | 3.1311 × 10^4 | 2.5062 × 10^4 | 1.3722 × 10^4 | 5.2478 × 10^4
F13 | Var | 3.8814 × 10^7 | 1.2894 × 10^5 | 1.3522 × 10^8 | 6.6879 × 10^7 | 2.0050 × 10^7 | 2.3650 × 10^8
F14 | Min | 1472.2338 | 1431.3035 | 1453.1541 | 1423.2821 | 1429.3626 | 1431.4869
F14 | Max | 2069.1500 | 1536.7192 | 2282.9765 | 1531.6019 | 2297.9909 | 5143.6059
F14 | Var | 3.1788 × 10^4 | 1113.3125 | 6.3312 × 10^4 | 1214.3278 | 6.3780 × 10^4 | 2.2832 × 10^6
F15 | Min | 2102.0812 | 1561.6616 | 1514.5328 | 1530.4193 | 1501.4497 | 1755.3194
F15 | Max | 8248.5418 | 2380.6647 | 1.7944 × 10^3 | 1835.0032 | 2035.3400 | 1.0501 × 10^4
F15 | Var | 4.5125 × 10^6 | 6.0137 × 10^4 | 6405.4267 | 9095.8887 | 2.7671 × 10^4 | 7.1257 × 10^6
F16 | Min | 1785.9923 | 1605.1444 | 1614.9232 | 1604.0291 | 1614.1054 | 1703.0662
F16 | Max | 2058.8581 | 1962.8200 | 2319.6410 | 1980.6734 | 1989.2964 | 2086.8136
F16 | Var | 9501.8979 | 1.0527 × 10^4 | 3.8956 × 10^4 | 1.0911 × 10^4 | 1.9357 × 10^4 | 1.4474 × 10^4
F17 | Min | 1736.2506 | 1735.7399 | 1711.0671 | 1726.6221 | 1711.3399 | 1741.0307
F17 | Max | 1828.8309 | 1774.8088 | 1988.8544 | 1784.6624 | 1841.2146 | 1894.6790
F17 | Var | 756.9156 | 102.0301 | 1.1332 × 10^4 | 297.7841 | 2068.8956 | 2046.5244
F18 | Min | 2795.7978 | 1899.0415 | 1900.8064 | 5348.1482 | 1837.5630 | 3284.2186
F18 | Max | 2.6628 × 10^4 | 4194.3397 | 9071.9493 | 3.2147 × 10^4 | 8313.3384 | 5.2491 × 10^4
F18 | Var | 6.6663 × 10^7 | 4.9868 × 10^5 | 7.3939 × 10^6 | 6.8506 × 10^7 | 6.0211 × 10^6 | 3.3418 × 10^8
F19 | Min | 2323.2878 | 1914.0998 | 1919.9589 | 1906.5796 | 1905.3476 | 2080.4660
F19 | Max | 5351.9820 | 2022.0524 | 2.6936 × 10^3 | 2656.9216 | 3.2933 × 10^4 | 2.4271 × 10^5
F19 | Var | 1.2781 × 10^6 | 1145.4605 | 4.8170 × 10^4 | 5.2873 × 10^4 | 9.6078 × 10^7 | 5.3813 × 10^9
F20 | Min | 2118.6468 | 2025.8065 | 2016.9142 | 2004.9954 | 2020.8626 | 2052.4900
F20 | Max | 2262.7814 | 2123.8647 | 2289.2135 | 2056.6773 | 2224.7862 | 2305.7172
F20 | Var | 1758.7689 | 1230.6049 | 9242.5172 | 231.5231 | 3534.4072 | 5965.3790
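Each Min/Max/Var row in Table 6 summarizes 30 independent runs of one algorithm on one function. As a minimal sketch (the `runs` values are illustrative data, not results from the paper, and the paper does not state whether population or sample variance is reported; population variance is assumed here):

```python
import random
import statistics

# Hypothetical best-fitness results of 30 independent runs on one function.
random.seed(0)
runs = [100.0 + random.random() * 0.5 for _ in range(30)]

# The three statistics reported per function/algorithm pair in Table 6.
row = {
    "Min": min(runs),
    "Max": max(runs),
    "Var": statistics.pvariance(runs),  # population variance over the runs
}
print(row)
```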
Table 7. Comparison of the Wilcoxon rank sum test results in CEC2017 functions.
Function | CSA | BES | COOT | WHO | WOA
F1 | 0.00075 | 0.00075 | 0.51989 | 0.02384 | 0.00018
F2 | 0.00006 | 0.00006 | 0.03764 | 0.47100 | 0.00018
F3 | N | N | 0.36812 | N | 0.00006
F4 | N | N | 0.03498 | 0.01493 | 0.00006
F5 | N | 0.03359 | N | 0.16749 | 0.00023
F6 | 0.01493 | 0.57148 | 0.10957 | 0.39943 | 0.00069
F7 | 1.00000 | 0.00007 | 0.04981 | 0.00010 | 0.00012
F8 | N | 0.16808 | 0.36812 | 0.07672 | 0.00023
F9 | N | N | 0.36812 | N | 0.00006
F10 | 0.00978 | 0.35909 | 0.00445 | 1.00000 | 0.96975
F11 | 0.52052 | 0.42736 | 0.34470 | 0.27304 | 0.01726
F12 | 0.00018 | 0.00018 | 0.04515 | 0.00018 | 0.42736
F13 | 0.00018 | 0.96985 | 0.42736 | 0.01402 | 0.03764
F14 | 0.00101 | 0.03121 | 0.00077 | 0.06402 | 0.30749
F15 | 0.00025 | 0.00018 | 0.00018 | 0.00018 | 0.00911
F16 | 0.00911 | 0.27304 | 0.00361 | 0.01726 | 0.47268
F17 | 0.01402 | 0.57075 | 0.00459 | 0.18588 | 0.14047
F18 | 0.00033 | 0.02113 | 0.27304 | 0.00283 | 0.12122
F19 | 0.00018 | 0.00033 | 0.00033 | 0.00283 | 0.05390
F20 | 0.00033 | 0.73373 | 0.00018 | 0.00220 | 0.34470
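The p-values in Tables 4 and 7 come from the Wilcoxon rank-sum test between the run results of VSCSA and those of each competitor; entries marked N presumably denote cases where the test is not informative because the samples coincide. A minimal stdlib sketch of the test under the normal approximation (the sample arrays below are illustrative, not the paper's data):

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum test p-value (normal approximation,
    midranks for ties)."""
    pooled = sorted(list(a) + list(b))
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average rank of the tie group
        i = j
    n1, n2 = len(a), len(b)
    w = sum(ranks[v] for v in a)            # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2             # mean of W under H0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sd
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

# Clearly separated samples give a small p; identical samples give p = 1.
p_diff = rank_sum_p([0.9, 1.0, 1.1, 1.2], [1.9, 2.0, 2.1, 2.2])
p_same = rank_sum_p([5.0] * 4, [5.0] * 4)
print(p_diff, p_same)
```

A p-value below 0.05 indicates a statistically significant difference between the two algorithms' result distributions at the 5% level, which is how the tables above are read.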

6. Engineering Applications

6.1. Three Bar Truss Design Problem

The three bar truss design problem is a civil engineering problem in which the weight of the bar structure is minimized over a problematic, constrained design space. The constraints of this problem are the stress constraints on each bar. Figure 14 is the structural diagram of the three bar truss design problem: A1, A2, and A3 denote the cross-sectional areas of the bars, P is the applied force, and L is the span length.
This problem can be described mathematically as follows:
$$
\begin{aligned}
\text{Consider}\quad & X=[x_1,\ x_2]=[A_1,\ A_2]\\
\text{Minimize}\quad & f(X)=\left(2\sqrt{2}\,x_1+x_2\right)\times L\\
\text{Subject to}\quad & g_1(X)=\frac{\sqrt{2}\,x_1+x_2}{\sqrt{2}\,x_1^2+2x_1x_2}\,P-\sigma\le 0\\
& g_2(X)=\frac{x_2}{\sqrt{2}\,x_1^2+2x_1x_2}\,P-\sigma\le 0\\
& g_3(X)=\frac{1}{\sqrt{2}\,x_2+x_1}\,P-\sigma\le 0\\
& 0\le x_1,x_2\le 1,\quad P=2\ \mathrm{kN/cm^2},\quad L=100\ \mathrm{cm},\quad \sigma=2\ \mathrm{kN/cm^2}
\end{aligned}
$$
In this paper, the basic CSA uses the parameter settings from the original CSA literature, and the comparison algorithms and their parameters follow [50]. Each method was tested 30 times with 1000 iterations and a maximum of 60,000 function evaluations (NFEs). The minimum, maximum, mean, and standard deviation values are given in Table 8.
The VSCSA Min value is the same as the CSA and WHO Min values. The VSCSA Max value is larger than the CSA Max value; WHO and CSA obtain the smaller Max and Avg values, and WHO obtains the smallest Std value. AEFA obtains the worst Min, Max, Std, and Avg values.
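Under the formulation above, a candidate design can be evaluated and checked for feasibility as follows. This is a sketch, not the authors' code; the near-optimal point used in the example, x1 ≈ 0.78868 and x2 ≈ 0.40825, is the well-known solution for this problem, whose objective value is close to the 263.8958 optimum reported in Table 8:

```python
import math

P, L, SIGMA = 2.0, 100.0, 2.0  # kN/cm^2, cm, kN/cm^2

def truss_objective(x1, x2):
    """Structural weight (up to material density) of the three bar truss."""
    return (2 * math.sqrt(2) * x1 + x2) * L

def truss_constraints(x1, x2):
    """Values of g1..g3; a design is feasible when all are <= 0."""
    s2 = math.sqrt(2)
    denom = s2 * x1 ** 2 + 2 * x1 * x2
    g1 = (s2 * x1 + x2) / denom * P - SIGMA
    g2 = x2 / denom * P - SIGMA
    g3 = 1 / (s2 * x2 + x1) * P - SIGMA
    return (g1, g2, g3)

x1, x2 = 0.78868, 0.40825  # near-optimal design from the literature
print(truss_objective(x1, x2))  # close to the 263.8958 optimum of Table 8
print(all(g <= 1e-4 for g in truss_constraints(x1, x2)))
```

A metaheuristic such as VSCSA would search over (x1, x2) while handling the constraints, e.g., via a penalty added to the objective for any positive g value.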

6.2. The Gear Train Problem

Minimizing the cost (error) of the gear ratio of the gear train is the key objective in the Gear Train Problem, which has only four parameters subject to boundary constraints. The four parameters are discrete because each gear must have an integer number of teeth, and these discrete variables add complexity to the problem. Figure 15 is the structural diagram of the Gear Train Problem. The four parameters are the numbers of teeth on the gears, nA, nB, nC, and nD; A, B, C, and D denote the centre points.
This problem can be described mathematically as follows:
$$
\begin{aligned}
\text{Consider}\quad & X=[x_1,\ x_2,\ x_3,\ x_4]\\
\text{Minimize}\quad & f(X)=\left(\frac{1}{6.931}-\frac{x_3 x_2}{x_1 x_4}\right)^{2}\\
\text{Subject to}\quad & 12\le x_1,x_2,x_3,x_4\le 60
\end{aligned}
$$
In this paper, the basic CSA uses the parameter settings from the original CSA literature, and the comparison algorithms and their parameters follow their respective literature. The population size is set to 50, the maximum number of iterations is set to 1000, and all algorithms are executed for 30 independent runs. The minimum, maximum, mean, and standard deviation values are given in Table 9. The comparison algorithms include CS [46], FPA [50], FSA [51], SA [52], and SCA [14]. The VSCSA Min value is the same as the CSA Min value, while SCA obtains the worst Min, Max, Std, and Avg values. The VSCSA Max, Std, and Avg values are larger than those of CSA. No single algorithm can solve all engineering problems perfectly; different algorithms can be selected for different engineering problems.
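Because the four teeth counts are integers in [12, 60], the global optimum of this problem can be verified by exhaustive search over the achievable gear ratios; a sketch (a brute-force check, not the authors' VSCSA-based method — the objective depends only on the products x2·x3 and x1·x4, so it suffices to enumerate distinct products):

```python
r = 1.0 / 6.931                  # target gear ratio
teeth = range(12, 61)            # admissible integer teeth counts

# Distinct products of two teeth counts (numerator x2*x3, denominator x1*x4).
products = sorted({a * b for a in teeth for b in teeth})

# Smallest squared error of any achievable ratio against the target.
best = min((r - num / den) ** 2 for num in products for den in products)
print(best)  # expected to match the Min column of Table 9 (about 2.7009e-12)
```

This confirms that the Min values reached by CS, CSA, and VSCSA in Table 9 correspond to the global optimum of the problem.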

7. Conclusions

In this paper, VSCSA is introduced to solve function optimization problems. The proposed algorithm uses the cosine function to enhance the CSA searching ability. VSCSA has strong problem applicability and can effectively find the global optimum within a short iteration period, greatly improving solution accuracy. In conclusion, the proposed VSCSA has significant advantages over CSA in CEC2017 fitness values, iteration curves, box plots, and search paths. In addition, the Wilcoxon test results statistically indicate differences between VSCSA and the other comparative algorithms. The engineering applications show that the proposed algorithm is strongly competitive. These data and conclusions indicate that the improvement strategy proposed in this paper achieves good results, greatly improving the performance of the original CSA. Many algorithms have been applied to specific fields such as medicine, aerospace, and industry, and have achieved good results. Therefore, combining VSCSA with practical problems in specific fields is a direction for future research.

Author Contributions

Conceptualization, Y.F.; methodology, Y.F.; formal analysis, H.Y.; investigation, Z.X.; resources, Y.W. and Y.F.; writing—original draft preparation, Y.F.; writing—review and editing, H.Y. and Z.X.; visualization, D.L.; supervision, Y.W.; project administration, Y.W.; funding acquisition, Y.W and Y.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 52175502, as well as the Natural Science Foundation of Heilongjiang Province, grant number LH2023E082, in addition to the basic research business fee projects of provincial undergraduate universities in Heilongjiang Province, grant number 2022-KYYWF-0144.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data used to support the findings of this study are included in this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ray, T.; Liew, K.M. Society and civilization: An optimization algorithm based on the simulation of social behavior. IEEE Trans. Evol. Comput. 2003, 7, 386–396. [Google Scholar] [CrossRef]
  2. Iba, K. Reactive power optimization by genetic algorithm. IEEE Trans. Power Syst. 1994, 9, 685–692. [Google Scholar] [CrossRef]
  3. Mafarja, M.; Aljarah, I.; Heidari, A.A.; Hammouri, A.I.; Faris, H.; Al-Zoubi, A.M.; Mirjalili, S. Evolutionary Population Dynamics and Grasshopper Optimization approaches for feature selection problems. Knowl. Based Syst. 2018, 145, 25–45. [Google Scholar] [CrossRef]
  4. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  5. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  6. Borchers, A.; Pieler, T. Programming pluripotent precursor cells derived from Xenopus embryos to generate specific tissues and organs. Genes 2010, 1, 413–426. [Google Scholar] [CrossRef]
  7. Storn, R.; Price, K. Differential Evolution-A Simple and Efficient Heuristic for global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  8. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  9. Zhong, C.; Li, G.; Meng, Z. Beluga whale optimization: A novel nature-inspired metaheuristic algorithm. Knowl. Based Syst. 2022, 251, 109215. [Google Scholar] [CrossRef]
  10. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  11. Karami, H.; Anaraki, M.V.; Farzin, S.; Mirjalili, S. Flow Direction Algorithm (FDA): A Novel Optimization Approach for Solving Optimization Problems. Comput. Ind. Eng. 2021, 156, 107224. [Google Scholar] [CrossRef]
  12. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  13. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  14. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  15. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70. [Google Scholar] [CrossRef]
  16. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  17. Cheng, M.-Y.; Prayogo, D. Symbiotic Organisms Search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112. [Google Scholar] [CrossRef]
  18. Naruei, I.; Keynia, F. Wild horse optimizer: A new meta-heuristic algorithm for solving engineering optimization problems. Eng. Comput. 2021, 38, 3025–3056. [Google Scholar] [CrossRef]
  19. Rezaei, F.; Safavi, H.R.; Abd Elaziz, M.; Mirjalili, S. GMO: Geometric mean optimizer for solving engineering problems. Soft Comput. 2023, 27, 10571–10606. [Google Scholar] [CrossRef]
  20. Chopra, N.; Ansari, M.M. Golden jackal optimization: A novel nature-inspired optimizer for engineering applications. Expert Syst. Appl. 2022, 198, 116924. [Google Scholar] [CrossRef]
  21. Dehghani, M.; Trojovska, E.; Trojovsky, P.; Montazeri, Z. Coati Optimization Algorithm: A new bio-inspired metaheuristic algorithm for solving optimization problems. Knowl.-Based Syst. 2023, 259, 110011. [Google Scholar] [CrossRef]
  22. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. Int. J. Intell. Real-Time Autom. 2022, 114, 105075. [Google Scholar] [CrossRef]
  23. Jia, H.; Peng, X.; Lang, C. Remora Optimization Algorithm. Expert Syst. Appl. 2021, 185, 115665. [Google Scholar] [CrossRef]
  24. Guan, Z.; Ren, C.; Niu, J.; Wang, P.; Shang, Y. Great Wall Construction Algorithm: A novel meta-heuristic algorithm for engineer problems. Expert Syst. Appl. 2023, 233, 120905. [Google Scholar] [CrossRef]
  25. Zhang, Y.; Jin, Z.; Mirjalili, S. Generalized normal distribution optimization and its applications in parameter extraction of photovoltaic models. Energy Convers. Manag. 2020, 224, 113301. [Google Scholar] [CrossRef]
  26. Trojovsky, P.; Dehghani, M. Pelican Optimization Algorithm: A Novel Nature-Inspired Algorithm for Engineering Applications. Sensors 2022, 22, 855. [Google Scholar] [CrossRef] [PubMed]
  27. Moosavi, S.H.S.; Bardsiri, V.K. Satin bowerbird optimizer: A new optimization algorithm to optimize ANFIS for software development effort estimation. Eng. Appl. Artif. Intell. 2017, 60, 1–15. [Google Scholar] [CrossRef]
  28. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–Learning-Based Optimization: An optimization method for continuous non-linear large scale problems. Inf. Sci. 2012, 183, 1–15. [Google Scholar] [CrossRef]
  29. Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2022, 39, 2627–2651. [Google Scholar] [CrossRef]
  30. Jiang, X.; Lin, Z.; He, T.; Ma, X.; Ma, S.; Li, S. Optimal Path Finding With Beetle Antennae Search Algorithm by Using Ant Colony Optimization Initialization and Different Searching Strategies. IEEE Access 2020, 8, 15459–15471. [Google Scholar] [CrossRef]
  31. Pan, H.; Gong, J. Application of Particle Swarm Optimization (PSO) Algorithm in Determining Thermodynamics of Solid Combustibles. Energies 2023, 16, 5302. [Google Scholar] [CrossRef]
  32. Zandavi, S.M.; Chung, V.Y.Y.; Anaissi, A. Stochastic Dual Simplex Algorithm: A Novel Heuristic Optimization Algorithm. IEEE Trans. Cybern. 2021, 51, 2725–2734. [Google Scholar] [CrossRef] [PubMed]
  33. Liang, X.; Cai, Z.; Wang, M.; Zhao, X.; Chen, H.; Li, C. Chaotic oppositional sine–cosine method for solving global optimization problems. Eng. Comput. 2020, 38, 1223–1239. [Google Scholar] [CrossRef]
  34. Pazhaniraja, N.; Basheer, S.; Thirugnanasambandam, K.; Ramalingam, R.; Rashid, M.; Kalaivani, J. Multi-objective Boolean grey wolf optimization based decomposition algorithm for high-frequency and high-utility itemset mining. AIMS Math. 2023, 8, 18111–18140. [Google Scholar] [CrossRef]
  35. Huang, Z.; Li, F.; Zhu, L.; Ye, G.; Zhao, T. Phase Mask Design Based on an Improved Particle Swarm Optimization Algorithm for Depth of Field Extension. Appl. Sci. 2023, 13, 7899. [Google Scholar] [CrossRef]
  36. Akın, P. A new hybrid approach based on genetic algorithm and support vector machine methods for hyperparameter optimization in synthetic minority over-sampling technique (SMOTE). AIMS Math. 2023, 8, 9400–9415. [Google Scholar] [CrossRef]
  37. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
  38. Shekhawat, S.; Saxena, A. Development and applications of an intelligent crow search algorithm based on opposition based learning. ISA Trans. 2020, 99, 210–230. [Google Scholar] [CrossRef] [PubMed]
  39. Chen, Y.; Ye, Z.; Gao, B.; Wu, Y.; Yan, X.; Liao, X. A Robust Adaptive Hierarchical Learning Crow Search Algorithm for Feature Selection. Electronics 2023, 12, 3123. [Google Scholar] [CrossRef]
  40. Díaz, P.; Pérez-Cisneros, M.; Cuevas, E.; Avalos, O.; Gálvez, J.; Hinojosa, S.; Zaldivar, D. An Improved Crow Search Algorithm Applied to Energy Problems. Energies 2018, 11, 571. [Google Scholar] [CrossRef]
  41. Bhullar, A.K.; Kaur, R.; Sondhi, S. Enhanced crow search algorithm for AVR optimization. Soft Comput. 2020, 24, 11957–11987. [Google Scholar] [CrossRef]
  42. Gadekallu, T.R.; Alazab, M.; Kaluri, R.; Maddikunta, P.K.R.; Bhattacharya, S.; Lakshmanna, K.; M, P. Hand gesture classification using a novel CNN-crow search algorithm. Complex Intell. Syst. 2021, 7, 1855–1868. [Google Scholar] [CrossRef]
  43. Braik, M.; Al-Zoubi, H.; Ryalat, M.; Sheta, A.; Alzubi, O. Memory based hybrid crow search algorithm for solving numerical and constrained global optimization problems. Artif. Intell. Rev. 2022, 56, 27–99. [Google Scholar] [CrossRef]
  44. Samieiyan, B.; MohammadiNasab, P.; Mollaei, M.A.; Hajizadeh, F.; Kangavari, M. Solving dimension reduction problems for classification using Promoted Crow Search Algorithm (PCSA). Computing 2022, 104, 1255–1284. [Google Scholar] [CrossRef]
  45. Guo, Q.; Chen, H.; Luo, J.; Wang, X.; Wang, L.; Lv, X.; Wang, L. Parameter inversion of probability integral method based on improved crow search algorithm. Arab. J. Geosci. 2022, 15, 180. [Google Scholar] [CrossRef]
  46. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  47. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  48. Alsattar, H.A.; Zaidan, A.A.; Zaidan, B.B. Novel meta-heuristic bald eagle search optimisation algorithm. Artif. Intell. Rev. 2020, 53, 2237–2264. [Google Scholar] [CrossRef]
  49. Naruei, I.; Keynia, F. A New Optimization Method Based on Coot Bird Natural Life Model. Expert Syst. Appl. 2021, 183, 115352. [Google Scholar] [CrossRef]
  50. Yang, X.-S.; Karamanoglu, M.; He, X. Flower pollination algorithm: A novel approach for multiobjective optimization. Eng. Optim. 2014, 46, 1222–1237. [Google Scholar] [CrossRef]
  51. Elsisi, M. Future search algorithm for optimization. Evol. Intell. 2018, 12, 21–31. [Google Scholar] [CrossRef]
  52. Osman, I.H. Metastrategy simulated annealing and tabu search algorithms for the vehicle routing problem. Ann. Oper. Res. 1993, 41, 421–451. [Google Scholar] [CrossRef]
Figure 1. The VSCSA Flowchart.
Figure 2. Iteration curves of two-dimension functions. (a) f1; (b) f2; (c) f3; (d) f4; (e) f5; (f) f6; (g) f7; (h) f8; (i) f9; (j) f10; (k) f11(D=2); (l) f12(D=2); (m) f13(D=2); (n) f14(D=2).
Figure 3. Iteration curves of high-dimension functions. (a) f11(D=30); (b) f12(D=30); (c) f13(D=30); (d) f14(D=30); (e) f11(D=60); (f) f12(D=60); (g) f13(D=60); (h) f14(D=60); (i) f11(D=200); (j) f12(D=200); (k) f13(D=200); (l) f14(D=200).
Figure 4. Box plot charts of two-dimension functions. (a) f1; (b) f2; (c) f3; (d) f4; (e) f5; (f) f6; (g) f7; (h) f8; (i) f9; (j) f10; (k) f11(D=2); (l) f12(D=2); (m) f13(D=2); (n) f14(D=2).
Figure 5. Box plot charts of high-dimension functions. (a) f11(D=30); (b) f12(D=30); (c) f13(D=30); (d) f14(D=30); (e) f11(D=60); (f) f12(D=60); (g) f13(D=60); (h) f14(D=60); (i) f11(D=200); (j) f12(D=200); (k) f13(D=200); (l) f14(D=200).
Figure 6. Sub-sequence runs radar charts of two-dimension functions. (a) f1; (b) f2; (c) f3; (d) f4; (e) f5; (f) f6; (g) f7; (h) f8; (i) f9; (j) f10; (k) f11(D=2); (l) f12(D=2); (m) f13(D=2); (n) f14(D=2).
Figure 7. Sub-sequence runs radar charts of high-dimension functions. (a) f11(D=30); (b) f12(D=30); (c) f13(D=30); (d) f14(D=30); (e) f11(D=60); (f) f12(D=60); (g) f13(D=60); (h) f14(D=60); (i) f11(D=200); (j) f12(D=200); (k) f13(D=200); (l) f14(D=200).
Figure 8. Three-dimension images of two-dimension functions. (a) f1; (b) f2; (c) f3; (d) f4; (e) f5; (f) f6; (g) f7; (h) f8; (i) f9; (j) f10; (k) f11(D=2); (l) f12(D=2); (m) f13(D=2); (n) f14(D=2).
Figure 9. Search paths. (a) f1; (b) f2; (c) f3; (d) f4; (e) f5; (f) f6; (g) f7; (h) f8; (i) f9; (j) f10; (k) f11(D=2); (l) f12(D=2); (m) f13(D=2); (n) f14(D=2).
Figure 10. Algorithm ranking figures. (a) Two-dimension functions. (b) High-dimension functions.
Figure 11. Iteration curves of CEC2017 functions. (a) F1; (b) F2; (c) F3; (d) F4; (e) F5; (f) F6; (g) F7; (h) F8; (i) F9; (j) F10; (k) F11; (l) F12; (m) F13; (n) F14; (o) F15; (p) F16; (q) F17; (r) F18; (s) F19; (t) F20.
Figure 12. Box plot charts of CEC2017 functions. (a) F1; (b) F2; (c) F3; (d) F4; (e) F5; (f) F6; (g) F7; (h) F8; (i) F9; (j) F10; (k) F11; (l) F12; (m) F13; (n) F14; (o) F15; (p) F16; (q) F17; (r) F18; (s) F19; (t) F20.
Figure 13. Sub-sequence runs radar charts of CEC2017 functions. (a) F1; (b) F2; (c) F3; (d) F4; (e) F5; (f) F6; (g) F7; (h) F8; (i) F9; (j) F10; (k) F11; (l) F12; (m) F13; (n) F14; (o) F15; (p) F16; (q) F17; (r) F18; (s) F19; (t) F20.
Figure 14. Three bar truss design problem.
Figure 15. Gear train problem.
Table 1. Basic information of benchmark functions.
Name | Function | D | Range | f_min
Beale | f1(x) = (1.5 − x_1 + x_1 x_2)^2 + (2.25 − x_1 + x_1 x_2^2)^2 + (2.625 − x_1 + x_1 x_2^3)^2 | 2 | [−50, 50] | 0
Bohachevsky01 | f2(x) = x_1^2 + 2 x_2^2 − 0.3 cos(3π x_1) − 0.4 cos(4π x_2) + 0.7 | 2 | [−50, 50] | 0
Bohachevsky02 | f3(x) = x_1^2 + 2 x_2^2 − 0.3 cos(3π x_1) cos(4π x_2) + 0.3 | 2 | [−50, 50] | 0
Bohachevsky03 | f4(x) = x_1^2 + 2 x_2^2 − 0.3 cos(3π x_1 + 4π x_2) + 0.3 | 2 | [−50, 50] | 0
Booth | f5(x) = (x_1 + 2 x_2 − 7)^2 + (2 x_1 + x_2 − 5)^2 | 2 | [−50, 50] | 0
Brent | f6(x) = (x_1 + 10)^2 + (x_2 + 10)^2 + exp(−x_1^2 − x_2^2) | 2 | [−50, 50] | 0
Cube | f7(x) = 100 (x_2 − x_1^3)^2 + (1 − x_1)^2 | 2 | [−50, 50] | 0
Leon | f8(x) = 100 (x_2 − x_1^2)^2 + (1 − x_1)^2 | 2 | [−50, 50] | 0
Levy13 | f9(x) = sin^2(3π x_1) + (x_1 − 1)^2 [1 + sin^2(3π x_2)] + (x_2 − 1)^2 [1 + sin^2(2π x_2)] | 2 | [−50, 50] | 0
Matyas | f10(x) = 0.26 (x_1^2 + x_2^2) − 0.48 x_1 x_2 | 2 | [−50, 50] | 0
Ackley 01 | f11(x) = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} x_i^2)) − exp((1/D) Σ_{i=1}^{D} cos(2π x_i)) + 20 + exp(1) | 2/30/60/200 | [−20, 20] | 0
Griewank | f12(x) = Σ_{i=1}^{D} x_i^2/4000 − Π_{i=1}^{D} cos(x_i/√i) + 1 | 2/30/60/200 | [−20, 20] | 0
Rastrigin | f13(x) = 10D + Σ_{i=1}^{D} (x_i^2 − 10 cos(2π x_i)) | 2/30/60/200 | [−20, 20] | 0
Sphere | f14(x) = Σ_{i=1}^{D} x_i^2 | 2/30/60/200 | [−20, 20] | 0
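Several of the benchmark functions in Table 1 translate directly into code; a minimal sketch of the Sphere, Rastrigin, and Ackley 01 functions, all of which have f_min = 0 at the origin:

```python
import math

def sphere(x):
    """Sphere function f14 of Table 1."""
    return sum(v * v for v in x)

def rastrigin(x):
    """Rastrigin function f13 of Table 1."""
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def ackley(x):
    """Ackley 01 function f11 of Table 1."""
    d = len(x)
    s1 = sum(v * v for v in x) / d
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / d
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

origin = [0.0] * 30
print(sphere(origin), rastrigin(origin), ackley(origin))  # all ~0
```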
Table 2. Comparison of results for two-dimension functions.
Function | Metric | VSCSA | CSA | CS | MFO | SCA
f1 | Min | 1.2634 × 10^−31 | 1.1569 × 10^−16 | 0.0436 | 8.2737 × 10^−22 | 0.0001
f1 | Max | 6.4206 × 10^−16 | 8.2586 × 10^−15 | 1.8208 | 0.0358 | 0.0035
f1 | Ave | 7.9525 × 10^−17 | 2.6914 × 10^−15 | 0.4393 | 0.0036 | 0.0009
f1 | Var | 4.1382 × 10^−32 | 7.2893 × 10^−30 | 0.2594 | 0.0001 | 1.2328 × 10^−6
f2 | Min | 0 | 2.2204 × 10^−16 | 0.5852 | 0 | 0
f2 | Max | 0 | 2.6645 × 10^−14 | 3.4157 | 0 | 0
f2 | Ave | 0 | 7.3053 × 10^−15 | 1.7169 | 0 | 0
f2 | Var | 0 | 7.5842 × 10^−29 | 1.2711 | 0 | 0
f3 | Min | 0 | 1.6653 × 10^−16 | 0.0910 | 0 | 0
f3 | Max | 0 | 3.9351 × 10^−12 | 2.0767 | 0 | 0
f3 | Ave | 0 | 5.8736 × 10^−13 | 1.1458 | 0 | 0
f3 | Var | 0 | 1.6012 × 10^−24 | 0.4273 | 0 | 0
f4 | Min | 0 | 5.5511 × 10^−17 | 0.4008 | 0 | 0
f4 | Max | 0 | 6.8778 × 10^−14 | 3.9791 | 3.3307 × 10^−16 | 0
f4 | Ave | 0 | 1.5071 × 10^−14 | 1.5279 | 6.6613 × 10^−17 | 0
f4 | Var | 0 | 4.8923 × 10^−28 | 1.4034 | 1.3559 × 10^−32 | 0
f5 | Min | 0 | 2.2380 × 10^−17 | 0.1529 | 0 | 8.5105 × 10^−5
f5 | Max | 1.4374 × 10^−15 | 9.8969 × 10^−15 | 5.2062 | 0 | 0.0072
f5 | Ave | 1.4374 × 10^−16 | 1.3184 × 10^−15 | 2.3982 | 0 | 0.0015
f5 | Var | 2.0660 × 10^−31 | 9.2626 × 10^−30 | 2.5784 | 0 | 4.1821 × 10^−6
f6 | Min | 1.3839 × 10^−87 | 1.2181 × 10^−17 | 0.2978 | 1.3839 × 10^−87 | 0.0001
f6 | Max | 1.2738 × 10^−21 | 1.3241 × 10^−15 | 2.2916 | 1.3839 × 10^−87 | 0.0554
f6 | Ave | 1.3301 × 10^−22 | 4.7342 × 10^−16 | 0.9435 | 1.3839 × 10^−87 | 0.0258
f6 | Var | 1.6096 × 10^−43 | 2.2174 × 10^−31 | 0.4555 | 5.5373 × 10^−206 | 0.0005
f7 | Min | 1.9671 × 10^−17 | 6.8577 × 10^−15 | 0.2525 | 0.0002 | 0.0002
f7 | Max | 0.0005 | 1.8490 × 10^−11 | 15.7426 | 7.1992 | 0.0056
f7 | Ave | 9.1750 × 10^−5 | 2.2728 × 10^−12 | 5.7119 | 1.5894 | 0.0025
f7 | Var | 2.3620 × 10^−8 | 3.2677 × 10^−23 | 36.2206 | 7.2811 | 3.1310 × 10^−6
f8 | Min | 3.2519 × 10^−27 | 7.0257 × 10^−15 | 0.8962 | 0.0064 | 0.0001
f8 | Max | 5.7350 × 10^−6 | 2.4108 × 10^−12 | 26.2582 | 39.3529 | 0.0368
f8 | Ave | 9.2305 × 10^−7 | 5.0263 × 10^−13 | 7.5532 | 5.2897 | 0.0104
f8 | Var | 4.0001 × 10^−12 | 6.7474 × 10^−25 | 52.2655 | 1.5027 × 10^2 | 0.0002
f9 | Min | 1.3498 × 10^−31 | 2.5846 × 10^−16 | 0.2671 | 1.3498 × 10^−31 | 0.0002
f9 | Max | 1.9689 × 10^−14 | 7.3082 × 10^−14 | 4.1046 | 1.3498 × 10^−31 | 0.0099
f9 | Ave | 2.9178 × 10^−15 | 1.6250 × 10^−14 | 1.8569 | 1.3498 × 10^−31 | 0.0033
f9 | Var | 4.3618 × 10^−29 | 5.1105 × 10^−28 | 0.9865 | 0 | 1.0867 × 10^−5
f10 | Min | 1.7336 × 10^−38 | 2.2204 × 10^−16 | 0.0014 | 4.8795 × 10^−50 | 5.0354 × 10^−54
f10 | Max | 2.4876 × 10^−29 | 2.0117 × 10^−13 | 0.2567 | 1.6616 × 10^−10 | 6.4181 × 10^−41
f10 | Ave | 2.6360 × 10^−30 | 2.5902 × 10^−14 | 0.0544 | 1.6789 × 10^−11 | 7.4082 × 10^−42
f10 | Var | 6.1178 × 10^−59 | 3.8412 × 10^−27 | 0.0057 | 2.7550 × 10^−21 | 4.0702 × 10^−82
f11(D=2) | Min | 8.8818 × 10^−16 | 5.5532 × 10^−9 | 0.4659 | 8.8818 × 10^−16 | 8.8818 × 10^−16
f11(D=2) | Max | 4.4409 × 10^−15 | 5.5989 × 10^−8 | 2.7931 | 2.5799 | 8.8818 × 10^−16
f11(D=2) | Ave | 1.2434 × 10^−15 | 1.8217 × 10^−8 | 1.7149 | 0.2580 | 8.8818 × 10^−16
f11(D=2) | Var | 1.2622 × 10^−30 | 3.0345 × 10^−16 | 0.5155 | 0.6656 | 0
f12(D=2) | Min | 0 | 0 | 0.0089 | 0 | 0
f12(D=2) | Max | 0.0074 | 0.0099 | 0.0150 | 0.0395 | 0.0085
f12(D=2) | Ave | 0.0015 | 0.0045 | 0.0115 | 0.0145 | 0.0016
f12(D=2) | Var | 9.7247 × 10^−6 | 1.6088 × 10^−5 | 4.6345 × 10^−6 | 1.8438 × 10^−4 | 1.1916 × 10^−5
f13(D=2) | Min | 0 | 0 | 2.5027 | 0 | 0
f13(D=2) | Max | 0.9950 | 0.9950 | 5.5228 | 1.9899 | 0
f13(D=2) | Ave | 0.4869 | 0.0995 | 4.4264 | 0.3980 | 0
f13(D=2) | Var | 0.2644 | 0.0990 | 1.1062 | 0.4840 | 0
f14(D=2) | Min | 9.4793 × 10^−39 | 3.1793 × 10^−18 | 0.0239 | 3.2958 × 10^−78 | 1.0019 × 10^−57
f14(D=2) | Max | 8.9804 × 10^−31 | 1.4926 × 10^−16 | 0.7788 | 1.8724 × 10^−19 | 1.0246 × 10^−40
f14(D=2) | Ave | 1.0017 × 10^−31 | 5.2384 × 10^−17 | 0.1987 | 1.8724 × 10^−20 | 1.0247 × 10^−41
f14(D=2) | Var | 7.9344 × 10^−62 | 3.3914 × 10^−33 | 0.0521 | 3.5060 × 10^−39 | 1.0497 × 10^−81
Table 3. Comparison of results for high dimension functions.
Function | Metric | VSCSA | CSA | CS | MFO | SCA
f11(D=30) | Min | 2.2797 | 3.2397 | 16.3819 | 10.0741 | 0.6109
f11(D=30) | Max | 5.1136 | 6.0112 | 17.7649 | 16.5201 | 7.2651
f11(D=30) | Ave | 3.7778 | 4.6759 | 17.1658 | 13.7082 | 2.8626
f11(D=30) | Var | 0.6331 | 0.9435 | 0.2970 | 4.7736 | 3.6873
f12(D=30) | Min | 0.0528 | 0.1364 | 1.2198 | 0.0300 | 0.0139
f12(D=30) | Max | 0.3587 | 0.3208 | 1.4571 | 0.5775 | 0.7466
f12(D=30) | Ave | 0.1378 | 0.2198 | 1.3653 | 0.2246 | 0.3989
f12(D=30) | Var | 0.0070 | 0.0056 | 0.0058 | 0.0388 | 0.0677
f13(D=30) | Min | 1.1286 × 10^2 | 1.3475 × 10^2 | 1.4140 × 10^3 | 1.5609 × 10^2 | 1.1746 × 10^2
f13(D=30) | Max | 2.5911 × 10^2 | 2.0885 × 10^2 | 2.0780 × 10^3 | 9.2546 × 10^2 | 2.5009 × 10^2
f13(D=30) | Ave | 1.7758 × 10^2 | 1.6834 × 10^2 | 1.7499 × 10^3 | 3.2263 × 10^2 | 1.8308 × 10^2
f13(D=30) | Var | 1.7198 × 10^3 | 7.5248 × 10^2 | 4.3256 × 10^4 | 6.8344 × 10^4 | 2.4061 × 10^3
f14(D=30) | Min | 0.1507 | 0.8730 | 1.1071 × 10^3 | 2.5817 | 0.8212
f14(D=30) | Max | 0.4330 | 2.8546 | 1.8329 × 10^3 | 8.0147 × 10^2 | 59.5845
f14(D=30) | Ave | 0.2653 | 1.8225 | 1.5994 × 10^3 | 3.4198 × 10^2 | 16.5211
f14(D=30) | Var | 0.0075 | 0.4599 | 5.5006 × 10^4 | 9.3346 × 10^4 | 2.8519 × 10^2
f11(D=60) | Min | 4.1431 | 4.1958 | 16.7884 | 14.5856 | 5.0868
f11(D=60) | Max | 6.6470 | 5.7192 | 18.2277 | 18.1983 | 10.4431
f11(D=60) | Ave | 5.1825 | 4.8548 | 17.6613 | 16.7067 | 7.8758
f11(D=60) | Var | 0.6236 | 0.4106 | 0.2953 | 1.7958 | 3.1980
f12(D=60) | Min | 0.2101 | 0.3262 | 1.7140 | 1.0847 | 0.0276
f12(D=60) | Max | 0.3411 | 0.6327 | 2.1414 | 1.4313 | 1.1034
f12(D=60) | Ave | 0.2524 | 0.4977 | 1.8979 | 1.2450 | 0.8115
f12(D=60) | Var | 0.0021 | 0.0087 | 0.0125 | 0.0143 | 0.1153
f13(D=60) | Min | 3.3309 × 10^2 | 3.7876 × 10^2 | 3.6370 × 10^3 | 8.6975 × 10^2 | 2.1715 × 10^2
f13(D=60) | Max | 6.2218 × 10^2 | 6.1118 × 10^2 | 5.3243 × 10^3 | 3.9577 × 10^3 | 7.7275 × 10^2
f13(D=60) | Ave | 5.0434 × 10^2 | 4.7404 × 10^2 | 4.3379 × 10^3 | 2.0584 × 10^3 | 4.8009 × 10^2
f13(D=60) | Var | 9.9127 × 10^3 | 6.2803 × 10^3 | 4.1675 × 10^5 | 1.1259 × 10^6 | 3.6050 × 10^4
f14(D=60) | Min | 2.7331 | 17.9029 | 2.6427 × 10^3 | 2.9806 × 10^2 | 1.2890 × 10^2
f14(D=60) | Max | 5.5279 | 27.1171 | 4.5444 × 10^3 | 1.5615 × 10^3 | 4.8950 × 10^2
f14(D=60) | Ave | 4.0482 | 21.1575 | 3.7431 × 10^3 | 9.2456 × 10^2 | 2.4631 × 10^2
f14(D=60) | Var | 0.8931 | 8.1825 | 3.7021 × 10^5 | 1.8256 × 10^5 | 1.6048 × 10^4
f11(D=200) | Min | 4.9721 | 5.4329 | 16.4399 | 18.5124 | 8.4967
f11(D=200) | Max | 6.4728 | 6.2622 | 18.2605 | 18.9722 | 11.7968
f11(D=200) | Ave | 5.5735 | 5.7298 | 17.6394 | 18.7693 | 9.9552
f11(D=200) | Var | 0.2370 | 0.0481 | 0.2890 | 0.0161 | 1.2549
f12(D=200) | Min | 0.5730 | 0.8061 | 3.6256 | 3.9739 | 1.1261
f12(D=200) | Max | 0.6674 | 0.9795 | 5.1979 | 4.8149 | 2.0141
f12(D=200) | Ave | 0.6197 | 0.9020 | 4.4142 | 4.2953 | 1.5740
f12(D=200) | Var | 0.0013 | 0.0035 | 0.1957 | 0.0626 | 0.0719
f13(D=200) | Min | 1.8775 × 10^3 | 1.9453 × 10^3 | 1.5111 × 10^4 | 1.5274 × 10^4 | 1.7142 × 10^3
f13(D=200) | Max | 2.5561 × 10^3 | 2.3118 × 10^3 | 1.8056 × 10^4 | 1.7675 × 10^4 | 4.6564 × 10^3
f13(D=200) | Ave | 2.2061 × 10^3 | 2.1896 × 10^3 | 1.6534 × 10^4 | 1.6121 × 10^4 | 3.2054 × 10^3
f13(D=200) | Var | 3.4431 × 10^4 | 1.1518 × 10^4 | 1.1715 × 10^6 | 9.4756 × 10^5 | 1.0841 × 10^6
f14(D=200) | Min | 42.8433 | 1.5256 × 10^2 | 1.0562 × 10^4 | 1.2646 × 10^4 | 1.3509 × 10^3
f14(D=200) | Max | 62.5081 | 2.0567 × 10^2 | 1.6804 × 10^4 | 1.5405 × 10^4 | 5.0878 × 10^3
f14(D=200) | Ave | 52.1370 | 1.8448 × 10^2 | 1.3558 × 10^4 | 1.4005 × 10^4 | 3.1410 × 10^3
f14(D=200) | Var | 33.5172 | 3.3378 × 10^2 | 3.1386 × 10^6 | 6.1661 × 10^5 | 1.8390 × 10^6
Table 4. Comparison of the Wilcoxon rank sum test results.
Function | CSA | CS | MFO | SCA
f1 | 0.00033 | 0.00018 | 0.00131 | 0.00018
f2 | 6.39 × 10^−5 | 6.39 × 10^−5 | N | N
f3 | 6.39 × 10^−5 | 6.39 × 10^−5 | N | N
f4 | 6.39 × 10^−5 | 6.39 × 10^−5 | 0.07758 | N
f5 | 0.00219 | 0.00018 | 0.00023 | 0.00018
f6 | 0.00018 | 0.00018 | 0.00221 | 0.00018
f7 | 0.00283 | 0.00018 | 0.00033 | 0.00033
f8 | 0.27304 | 0.00018 | 0.00018 | 0.00018
f9 | 0.00443 | 0.00017 | 0.00597 | 0.00017
f10 | 0.00018 | 0.00018 | 0.79134 | 0.00018
f11(D=2) | 0.00009 | 0.00009 | 1.00000 | 0.36812
f12(D=2) | 0.01903 | 0.00013 | 0.01914 | 0.88154
f13(D=2) | 0.87766 | 0.00015 | 0.60255 | 0.01429
f14(D=2) | 0.00018 | 0.00018 | 0.00283 | 0.00018
f11(D=30) | 0.07566 | 0.00018 | 0.00018 | 0.06402
f12(D=30) | 0.01133 | 0.00018 | 0.67758 | 0.03121
f13(D=30) | 0.73373 | 0.00018 | 0.18588 | 0.79134
f14(D=30) | 0.00018 | 0.00018 | 0.00018 | 0.00018
f11(D=60) | 0.57075 | 0.00018 | 0.00018 | 0.00283
f12(D=60) | 0.00033 | 0.00018 | 0.00018 | 0.00283
f13(D=60) | 0.42736 | 0.00018 | 0.00018 | 0.62318
f14(D=60) | 0.00018 | 0.00018 | 0.00018 | 0.00018
f11(D=200) | 0.14047 | 0.00018 | 0.00018 | 0.00018
f12(D=200) | 0.00018 | 0.00018 | 0.00018 | 0.00018
f13(D=200) | 0.85011 | 0.00018 | 0.00018 | 0.02113
f14(D=200) | 0.00018 | 0.00018 | 0.00018 | 0.00018
Table 8. Results of the three-bar truss problem.
Algorithm | Min | Max | Std | Avg
WHO | 263.8958433765 | 263.8958433765 | 1.2710574865 × 10⁻¹³ | 263.8958433765
PSO | 263.8958433827 | 263.8960409745 | 5.3917161119 × 10⁻⁵ | 263.8959010895
GA | 263.8958919373 | 263.9970875475 | 0.0252055577 | 263.9095296976
AEFA | 265.1001279647 | 280.9534461900 | 4.0558625686 | 271.8733092380
FA | 263.8958477145 | 263.8989975836 | 8.8455344984 × 10⁻⁴ | 263.8964634153
GSA | 263.8968857660 | 264.1972851298 | 0.0948941056 | 264.0059193538
HHO | 263.8959528570 | 264.0672685182 | 0.0467621287 | 263.9419743129
MVO | 263.8958747019 | 263.9000377233 | 9.8601397499 × 10⁻⁴ | 263.8967256362
WOA | 263.8959383525 | 265.6916186134 | 0.5029074306 | 264.3105859277
SSA | 263.8958435096 | 263.8998220362 | 7.2678747873 × 10⁻⁴ | 263.8962415757
GWO | 263.8959818300 | 263.9028435626 | 0.0014371714 | 263.8975822284
CSA | 263.8958433765 | 263.8958433765 | 6.4741204424 × 10⁻¹² | 263.8958433765
VSCSA | 263.8958433765 | 263.9145156687 | 0.0037434952 | 263.8981466437
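For context, the three-bar truss benchmark behind Table 8 minimizes structure weight subject to three stress constraints. The sketch below uses the formulation and constants standard in the literature (bar length l = 100 cm, load P = 2 kN/cm², allowable stress σ = 2 kN/cm²); these constants are assumptions taken from the common benchmark definition, not restated in this excerpt:

```python
import math

# Standard literature constants for the three-bar truss benchmark
# (assumed here; the excerpt does not restate them).
L_BAR, P, SIGMA = 100.0, 2.0, 2.0

def truss_weight(x1, x2):
    """Weight of the truss for cross-section areas x1 (the two
    outer bars) and x2 (the middle bar)."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * L_BAR

def truss_constraints(x1, x2):
    """Stress constraints; a feasible design needs every g <= 0."""
    d = math.sqrt(2.0) * x1 * x1 + 2.0 * x1 * x2
    return [
        (math.sqrt(2.0) * x1 + x2) / d * P - SIGMA,    # g1
        x2 / d * P - SIGMA,                            # g2
        1.0 / (x1 + math.sqrt(2.0) * x2) * P - SIGMA,  # g3
    ]

# Near-optimal design reported in the literature.
x1, x2 = 0.788675, 0.408248
print(round(truss_weight(x1, x2), 4))  # 263.8958, matching the Min column
print(all(g <= 1e-3 for g in truss_constraints(x1, x2)))  # feasible (g1 is active)
```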
Table 9. Results of the gear train design problem.
Algorithm | Min | Max | Std | Avg
CS | 2.7008571489 × 10⁻¹² | 8.7008339998 × 10⁻⁹ | 2.5469034697 × 10⁻⁹ | 2.5277681200 × 10⁻⁹
FPA | 2.3078157333 × 10⁻¹¹ | 1.3616491391 × 10⁻⁹ | 5.1819924289 × 10⁻¹⁰ | 5.5155436237 × 10⁻¹⁰
FSA | 1.0935663792 × 10⁻⁹ | 4.4677248806 × 10⁻⁷ | 8.5620977463 × 10⁻⁸ | 4.7845971457 × 10⁻⁸
SA | 2.3078157333 × 10⁻¹¹ | 1.3616491391 × 10⁻⁹ | 4.8777877665 × 10⁻¹⁰ | 6.1683323242 × 10⁻¹⁰
SCA | 3.6358329757 × 10⁻⁹ | 2.0768133383 × 10⁻¹ | 4.9002989331 × 10⁻² | 1.6613443644 × 10⁻²
CSA | 2.7008571489 × 10⁻¹² | 2.3576406580 × 10⁻⁹ | 5.5363138249 × 10⁻¹⁰ | 2.7032649321 × 10⁻¹⁰
VSCSA | 2.7008571489 × 10⁻¹² | 2.7264505977 × 10⁻⁸ | 7.3324954585 × 10⁻⁹ | 4.4138792095 × 10⁻⁹
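The gear train design problem behind Table 9 picks four integer tooth counts so that the resulting gear ratio approximates 1/6.931; the objective is the squared ratio error, with each tooth count an integer in [12, 60]. The form and the best-known solution below come from the common benchmark definition, not from this excerpt:

```python
# Gear train design objective: squared error between the target
# ratio 1/6.931 and the ratio produced by tooth counts
# (Ta, Tb, Td, Tf), each an integer in [12, 60] (standard bounds).
def gear_cost(ta, tb, td, tf):
    return (1.0 / 6.931 - (tb * td) / (ta * tf)) ** 2

# Best-known solution from the literature: ratio 304/2107.
best = gear_cost(49, 16, 19, 43)
print(best)  # ≈ 2.7009e-12, matching the Min column for CSA and VSCSA
```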

Fan, Y.; Yang, H.; Wang, Y.; Xu, Z.; Lu, D. A Variable Step Crow Search Algorithm and Its Application in Function Problems. Biomimetics 2023, 8, 395. https://doi.org/10.3390/biomimetics8050395
