Article

Multi-Strategy Boosted Fick’s Law Algorithm for Engineering Optimization Problems and Parameter Estimation

1 Department of Applied Mathematics, Xi'an University of Technology, Xi'an 710054, China
2 Computer Network Information Center, Xi'an University of Technology, Xi'an 710048, China
* Author to whom correspondence should be addressed.
Biomimetics 2024, 9(4), 205; https://doi.org/10.3390/biomimetics9040205
Submission received: 18 February 2024 / Revised: 21 March 2024 / Accepted: 22 March 2024 / Published: 28 March 2024

Abstract

To address the shortcomings of the recently proposed Fick's Law Algorithm, which is prone to local convergence and poor convergence efficiency, we propose a multi-strategy improved Fick's Law Algorithm (FLAS). The method combines multiple effective strategies, including a differential mutation strategy, a Gaussian local mutation strategy, an interweaving-based comprehensive learning strategy, and a seagull update strategy. First, the differential mutation strategy is added in the search phase to increase randomness and expand the scope of the search space. Second, by introducing the Gaussian local mutation, search diversity is increased, and the exploration capability and convergence efficiency are further improved. Further, a comprehensive learning strategy that simultaneously updates multiple individual parameters is introduced to improve search diversity and shorten the running time. Finally, the stability of the update is improved by adding a global search mechanism that balances the distribution of molecules on both sides during seagull updates. To test the competitiveness of the algorithm, the exploration and exploitation capability of the proposed FLAS is validated on 23 benchmark functions and the CEC2020 test suite. FLAS is also compared with other algorithms on seven engineering optimization problems: a speed reducer, a three-bar truss, a gear transmission system, piston rod optimization, a gas transmission compressor, a pressure vessel, and a stepped cone pulley. The experimental results verify that FLAS can effectively optimize conventional engineering optimization problems. Finally, the engineering applicability of FLAS is further highlighted by analyzing the results of parameter estimation for the solar PV model.

1. Introduction

Optimization algorithms are designed to find optimal or near-optimal solutions [1]. Heuristic algorithms and meta-heuristic algorithms (MAs) are concrete implementations of optimization algorithms. Heuristic algorithms optimize performance by searching the solution space under heuristic rules and empirical guidance [2]. MAs, in turn, achieve optimal results by combining and adjusting different heuristic algorithms.
MAs are widely used and essential in science, technology, finance, and medicine. For example, in artificial intelligence (AI), MAs improve a model's performance and accuracy [3]. They can also be used to optimize problems in AI applications such as intelligent recommendation systems, image processing, and natural language processing to provide a better user experience and quality of service. In the financial field, MAs can be used to optimize investment portfolios and risk management to improve investment returns and reduce risks. In the medical field, MAs can be used to optimize the allocation and scheduling of medical resources to improve work efficiency in medical services. In transportation, MAs can be used to optimize traffic signal control and route planning to reduce traffic congestion and improve transportation efficiency [4]. In addition, the application of MAs presents opportunities for algorithmic improvement, multi-objective optimization, and integration with machine learning, but also challenges, including high complexity, parameter selection, and local optimal solutions.
There are many different types of MAs. This paper mainly classifies them according to animal, plant, discipline, and other types. Animal-based optimization algorithms simulate the social behavior of animals in groups and collectives. For example, the best known is Particle Swarm Optimization (PSO), which simulates specific behaviors of birds during flight or foraging [5]. Krill Herd (KH) [6] simulates the grazing behavior of individual krill. Harris Hawks Optimization (HHO) [7] is mainly inspired by the cooperative behavior and chasing style of Harris hawks in nature. Abdel-Basset et al. proposed a Nutcracker Optimizer (NO) [8]; NO simulates two different behaviors exhibited by the nutcracker at different times. The Whale Optimization Algorithm (WOA) [9] was developed, inspired by the feeding process of whales. The Genghis Khan Shark Optimizer (GKSO) [10] simulates predation and survival behaviors.
A plant-based optimization algorithm simulates the intelligent clustering behavior of plants. Dandelion Optimizer (DO) simulates the process of dandelion seeds flying over long distances relying on the wind [11]. Invasive Weed Optimization (IWO) simulates the basic processes of dispersal, growth, reproduction, and competitive extinction of weed seeds in nature [12]. Tree Growth Algorithm (TGA) [13] is an algorithm that simulates the competition among trees for access to light and food.
Discipline-based approaches primarily include chemical, mathematical, and physical approaches. They accomplish optimization by simulating fundamental physical laws and chemical phenomena. Among physics-based methods, Zhang et al. proposed a Growth Optimizer (GO) subject to the learning and reflective mechanisms of individuals during social growth; it mathematically simulates growth behavior [14]. Abdel-Basset and El-Shahat proposed a Young's Double-Slit Experiment (YDSE) optimizer, inspired by Young's double-slit experiment [15]. Special Relativity Search (SRS) simulates the interaction of particles in a magnetic field [16]. Among chemistry-based methods, Atomic Search Optimization (ASO) mathematically models and simulates the movement of atoms in nature [17]. Nature-inspired chemical reaction optimization algorithms mimic the principles of chemical reactions in nature [18]. Smell Agent Optimization (SAO) considers the relationship between odor agents and objects that vaporize odor molecules [19]. Ray Optimization (RO) [20] is based on the idea of Snell's law of refraction. Algorithms based on mathematical methods have also been extensively studied; for example, the Arithmetic Optimization Algorithm (AOA) [21] simulates the distributional properties of four basic operators, the Sine Cosine Algorithm (SCA) [22] is built on the sine and cosine functions, and the Subtraction-Average-Based Optimizer (SABO) [23] updates an individual's position by subtracting the average of positions.
Other types of algorithms include, but are not limited to, optimization algorithms based on human acquisition, e.g., Social Group Optimization (SGO) simulates the social behavior of humans in solving complex problems [24]. Social Evolution and Learning Optimization (SELO) simulates the social learning behavior of humans organized in families in social settings [25]. Student Psychology-Based Optimization (SPBO) simulates the process of students improving their level of proficiency in multiple ways during the learning process [26]. Further, the corresponding algorithms and publication dates are given, as shown in Figure 1.
To increase the broad application of MAs in various fields, researchers constantly update the design and performance of MAs and develop more efficient and accurate algorithms. Fatma A. Hashim [27] proposed a physics-based MA called Fick's Law Algorithm (FLA), which optimizes by constantly updating the positions of molecules in different motion states under a concentration difference. It has been proven that FLA leads to optimal solutions with high robustness when confronted with real-world engineering optimization problems. However, some experimental studies have also found that FLA suffers from local convergence as well as degradation of convergence accuracy when faced with high-dimensional, highly complex problems [28]. Therefore, we improve the FLA by introducing several efficient but non-redundant strategies, yielding the proposed multi-strategy improved FLA (FLAS).
The proposed FLAS adopts several strategies to improve its performance. Specifically, at the beginning of the exploration phase, FLAS adds differential and Gaussian local mutation strategies to expand the search range in later iterations. During the transition phase, FLAS uses a crossover-based comprehensive learning strategy to enhance global search ability and randomness. FLAS also adopts the Levy flight strategy in position updates, generating random steps with a Levy distribution that allow an extensive random search of the search space. This randomness provides excellent global search ability and effectively controls the search extent and convergence rate. Finally, in the exploitation phase, FLAS adds the global search mechanism of the seagull algorithm's migration phase, aiming to speed up convergence by avoiding molecular aggregation. Through the organic combination of these strategies, FLAS can effectively overcome the problems of FLA in convergence accuracy and the convergence process and achieve better performance. The competitiveness and validity of FLAS are validated on 23 benchmark functions, the CEC2020 test suite, 7 engineering design problems, and a solar PV parameter estimation application. The results of FLAS are compared with those of other recent MAs and are statistically analyzed using the Wilcoxon rank sum test.
The main contributions of this study are as follows:
(1) To overcome the shortcomings of the original FLA, a new optimization algorithm named FLAS is proposed by introducing the differential variation strategy, Gaussian local variation strategy, interweaving-based comprehensive learning strategy, and seagull update strategy.
(2) Some classical and newly proposed algorithms are selected as comparison algorithms, and the optimization ability of FLAS is evaluated on 23 benchmark functions and the CEC2020 test set. The computational results of various performance metrics show that the proposed FLAS has the best overall performance in most of the tested functions.
(3) The FLAS is applied to seven engineering optimizations and the solar PV model parameter estimation, respectively. The results show that FLAS can stably provide the most reliable optimization design strategies for most practical problems.
The remainder of this paper is organized as follows: Section 2 analyzes and summarizes FLA. Section 3 presents the multi-strategy improved Fick's Law Algorithm, obtained by adding five different strategies. Section 4 tests the performance of FLAS, and its results on the CEC2020 test set are compared with other methods. Section 5 and Section 6 apply FLAS to seven practical engineering design problems and the solar PV parameter estimation application. Finally, this study is summarized and future work is outlined.

2. An Overview of the FLA

Fick’s law describes the fundamental principle of diffusion of substances in physics and chemistry. Therefore, Fick’s law algorithm simulates Fick’s law process of substance diffusion [27]. According to Fick’s law, the greater the concentration gradient, the faster the diffusion rate [29]. Therefore, Fick’s algorithm changes the concentration relationship between the two sides by adjusting the position of molecules in different regions to ensure a stable position of the molecules, thus realizing the optimization process. Figure 2 shows the schematic diagram of molecular movement in Fick’s law.
In the FLA algorithm, the optimization process first requires randomly initializing a set of candidate populations (X), whose matrix can be expressed as Equation (1).
$X = \begin{bmatrix} X_{1,1} & X_{1,2} & \cdots & X_{1,D} \\ X_{2,1} & X_{2,2} & \cdots & X_{2,D} \\ \vdots & \vdots & \ddots & \vdots \\ X_{N,1} & X_{N,2} & \cdots & X_{N,D} \end{bmatrix}$, (1)
where N is the population size, and D is the individual dimension. In addition, Xi,j in the matrix represents the jth dimension of the ith molecule. The formula for random initialization is as follows:
$X_{i,:} = lb + rand(1, D) \times (ub - lb)$, (2)
where ub and lb represent the upper and lower bounds, and rand(1, D) is a vector of random numbers uniformly generated in the search region. N is divided into two equal-sized subgroups, N1 and N2, and the fitness of the populations of N1 and N2 is calculated, respectively. Molecules constantly move from high to low concentration, and TFt is a parameter of the iteration function, described by Equation (3).
$TF^t = \sinh(t/T)^{C_1}$, (3)
where t represents the tth iteration and T represents the maximum number of iterations. sinh serves as a nonlinear transfer function that ensures an efficient transition from exploration to exploitation [27]. Further, the molecular position equation is updated by Equation (4).
$X_i^t = \begin{cases} DO & TF^t < 0.9 \\ EO & 0.9 \le TF^t \le 1 \\ SSO & TF^t > 1 \end{cases}$ (4)
The FLA update process consists of three stages (diffusion operator (DO), equilibrium operator (EO), steady-state operator (SSO)). Through the three ways, FLA can find the best value of the system. We will specifically describe these three parts below.
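As a minimal sketch of this overall control flow, the initialization of Equation (2), the transfer function of Equation (3), and the operator dispatch of Equation (4) can be written as follows (function and variable names are our own; C1 = 0.5 is an assumed value, since the article does not restate it):

```python
import math
import random

def initialize_population(n, dim, lb, ub, rng):
    """Randomly place n molecules inside [lb, ub] (Eq. (2))."""
    return [[lb + rng.random() * (ub - lb) for _ in range(dim)] for _ in range(n)]

def transfer_factor(t, T, c1=0.5):
    """Nonlinear transfer function TF^t = sinh(t/T)^C1 (Eq. (3)); C1 is assumed."""
    return math.sinh(t / T) ** c1

def select_operator(tf):
    """Choose the FLA update operator from TF^t (Eq. (4))."""
    if tf < 0.9:
        return "DO"    # diffusion operator: exploration
    if tf <= 1.0:
        return "EO"    # equilibrium operator: transition
    return "SSO"       # steady-state operator: exploitation

rng = random.Random(42)
pop = initialize_population(10, 5, -100.0, 100.0, rng)
n1, n2 = pop[:5], pop[5:]   # two equal-sized subgroups N1 and N2
```

Because sinh(t/T) exceeds 1 as t approaches T, the schedule naturally moves the population from DO through EO into SSO over the run.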

2.1. Exploration Phase

In the first stage, the diffusion of molecules from high to low concentration is called DO, as shown in Figure 3. When $TF^t < 0.9$, due to the concentration difference between the two regions i and j, molecules will transfer from one region to the other according to the parameter $T_{DO}^t$, given by Equation (5):
$T_{DO}^t = C_5 \times TF^t - r$, (5)
where C5 represents a fixed constant with a value of 2, and r means a random number with a value between 0 and 1.
From the parameter $T_{DO}^t$, the flow direction of the molecule is determined by Equation (6):
$X_{p,i}^t = \begin{cases} \text{from } i \text{ to } j & T_{DO}^t < rand \\ \text{from } j \text{ to } i & \text{otherwise} \end{cases}$ (6)
where rand is a random number with a value between 0 and 1. Consider that some molecules move from region i to region j. The formula for the number of molecules that move from region i to region j is as follows:
$NT_{ij} = N_i \times r_1 \times (C_4 - C_3) + N_i \times C_3$, (7)
where C3 and C4 represent the fixed constant with values of 0.1 and 0.2, respectively. NT12 and Ntransfer denote the number of molecules flowing at different stages, respectively. r1 is the random number in the interval [0, 1]. The specific formulae are as follows:
$NT_{12} = N_1 \times r_1 \times (C_4 - C_3) + N_1 \times C_3$, (8)
$N_{transfer} = N_2 \times r_1 \times (C_4 - C_3) + N_2 \times C_3$. (9)
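The transfer-count formulas above are straightforward to compute; a small sketch (names are illustrative, C3 = 0.1 and C4 = 0.2 as stated, and the rounding to an integer count is our addition, since the article's equations leave it implicit):

```python
import random

def n_transferred(n_region, r1, c3=0.1, c4=0.2):
    """Number of molecules leaving a region of n_region molecules.
    Rounded to an integer count (rounding is implicit in the article)."""
    return round(n_region * r1 * (c4 - c3) + n_region * c3)

rng = random.Random(0)
nt12 = n_transferred(25, rng.random())        # molecules leaving region 1
n_transfer = n_transferred(25, rng.random())  # molecules leaving region 2
```

With C3 = 0.1 and C4 = 0.2, between 10% and 20% of each region migrates per step, so the two subpopulations never empty out.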
The NT12 molecules will move to the other region, and their positions will be updated mainly based on the best-balanced molecule in region j using Equation (10).
$X_{p,i}^{t+1} = X_{EO,j}^t + DF_{p,i}^t \times DOF \times r_2 \times (J_{i,j}^t \times X_{EO,j}^t - X_{p,i}^t)$, (10)
where DOF is the change of flow direction with time, $J_{i,j}^t$ is the diffusion flux, and $X_{EO,i}^t$ denotes the equilibrium position of region i; r2 is a random number in the interval [0, 1].
$DOF = \exp(-C_2 (TF^t - r_1))$, (11)
$J_{i,j}^t = -D \dfrac{dc_{i,j}^t}{dx_{i,j}^t}$, (12)
$dc_{i,j}^t = X_{m,j}^t - X_{m,i}^t$, (13)
$dx_{i,j}^t = (X_{EO,j}^t)^2 - (X_{p,i}^t)^2 + eps$, (14)
where C2 represents a fixed constant with a value of 2; $X_{m,j}^t$ and $X_{m,i}^t$ are the mean positions of regions j and i, respectively; and eps is a small value. D is the effective diffusion coefficient, and $dc_{i,j}^t / dx_{i,j}^t$ is the concentration gradient. In addition, NR1 denotes the molecules that remain in region i; these molecules update their positions by Equation (16).
$NR_1 = N_1 - NT_{12}$. (15)
The remaining molecules in region i update their positions according to three different strategies in Equation (16); in the last case, their positions do not change.
$X_{p,i}^{t+1} = \begin{cases} X_{EO,i}^t & rand < 0.8 \\ X_{EO,i}^t + DOF \times (r_3 \times (U - L) + L) & rand < 0.9 \\ X_{p,i}^t & \text{otherwise} \end{cases}$ (16)
where r3 is the random number in the interval [0, 1]. U and L are the max and min limit.
For molecules in region j, where the concentration is higher, the boundary problem is treated with the following Equation (17):
$X_{p,j}^{t+1} = X_{EO,j}^t + DOF \times (r_4 \times (U - L) + L)$, (17)
where r4 is the random number in the interval [0, 1].
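A scalar sketch of the three-branch update for molecules that remain in region i and the boundary update for region j (the two equations above); function names are our own, and DOF is taken as a precomputed input:

```python
import random

def update_region_i(x_p, x_eo_i, dof, U, L, rng):
    """Three-branch DO update for a molecule staying in region i:
    adopt the equilibrium position, perturb around it, or stay put."""
    r = rng.random()
    if r < 0.8:
        return x_eo_i                                   # move to equilibrium molecule
    if r < 0.9:
        return x_eo_i + dof * (rng.random() * (U - L) + L)
    return x_p                                          # position unchanged

def update_region_j(x_eo_j, dof, U, L, rng):
    """DO update for a molecule in region j, around its equilibrium molecule."""
    return x_eo_j + dof * (rng.random() * (U - L) + L)
```

The 0.8/0.9 thresholds mean most stayers are pulled directly to the equilibrium molecule, a small fraction are perturbed, and the rest keep their position.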

2.2. The Transition Phase from Exploration to Exploitation

In the second stage, the concentration difference is almost zero, and the molecules try to reach an equilibrium state; this operator is called EO, as shown in Figure 4.
This phase is considered a transition from the exploration phase to the exploitation phase. Molecules update their location by the following Equation (18).
$X_{p,g}^{t+1} = X_{EO,p}^t + Q_{EO,p}^t \times X_{p,g}^t + Q_{EO,p}^t \times (MS_{p,EO}^t \times X_{EO,g}^t - X_{p,g}^t)$, (18)
where $X_{p,g}^t$ and $X_{EO,g}^t$ are the positions of molecule p and the equilibrium molecule in group g, and $MS_{p,EO}^t$ is the motion step of group g, calculated by Equation (25). $Q_{EO,g}^t$ is the diffusion rate factor of the group g region, and the calculation formula is as follows:
$Q_{EO,g}^t = R_1^t \times DF_g^t \times DRF_{EO,g}^t$, (19)
where $DF_g^t$ is the direction factor, equal to ±1; $R_1^t$ is a random number in the interval [0, 1]; and $DRF_{EO,g}^t$ represents the diffusion rate.
$DRF_{EO,g}^t = \exp(-J_{p,EO}^t / TF^t)$, (20)
$J_{p,EO}^t = -D \dfrac{dc_{g,EO}^t}{dx_{p,EO}^t}$, (21)
$dc_{g,EO}^t = X_{g,EO}^t - X_{m,g}^t$, (22)
$dx_{p,EO}^t = (X_{g,EO}^t)^2 - (X_{p,g}^t)^2 + eps$, (23)
$DF_g^t = \pm 1 \quad \text{(direction factor)}$, (24)
$MS_{p,EO}^t = \exp\!\left(-\dfrac{FS_{g,EO}^t}{FS_{p,g}^t + eps}\right) \quad \text{(motion step)}$, (25)
$R_1^t = rand[0, 1], \quad d = 1 : D$. (26)
where $FS_{g,EO}^t$ is the optimum of group g at time t, and $FS_{p,g}^t$ is the fitness of molecule p in group g at time t.
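A scalar sketch of the EO update chain (Equations (18)-(26)); the negative signs follow our reconstruction of Fick's law, the group equilibrium and region equilibrium are collapsed into one value for simplicity, and all names and default constants are illustrative:

```python
import math

def eo_update(x_p, x_eo, fs_group_best, fs_p, tf,
              D=0.01, eps=1e-12, r1=0.5, df=1):
    """Equilibrium-operator update of one coordinate (Eq. (18))."""
    dc = x_eo - x_p                                  # concentration difference (Eq. (22))
    dx = x_eo ** 2 - x_p ** 2 + eps                  # Eq. (23)
    j = -D * dc / dx                                 # diffusion flux (Eq. (21))
    drf = math.exp(-j / tf)                          # diffusion rate (Eq. (20))
    q = r1 * df * drf                                # diffusion rate factor (Eq. (19))
    ms = math.exp(-fs_group_best / (fs_p + eps))     # motion step (Eq. (25))
    return x_eo + q * x_p + q * (ms * x_eo - x_p)    # Eq. (18)
```

Note how the motion step shrinks toward zero when the molecule's own fitness is already close to the group optimum, damping movement near equilibrium.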

2.3. Exploitation Phase

In the third phase, we move the barrier so that the molecule moves to the most stable position to achieve a more stable molecule distribution. In the SSO phase, the molecule updates its position by the following Equation (27).
$X_{p,g}^{t+1} = X_{ss}^t + Q_g^t \times X_{p,g}^t + Q_g^t \times (MS_{p,g}^t \times X_{ss}^t - X_{p,g}^t)$, (27)
where $X_{ss}^t$ and $X_{p,g}^t$ are the positions of the steady state and of molecule p, and $Q_g^t$ and $MS_{p,g}^t$ respectively represent the diffusion rate factor of region g and the motion step; the calculation formulas are as follows:
$Q_g^t = R_1^t \times DF_g^t \times DRF_g^t$, (28)
$DRF_g^t = \exp(-J_{p,ss}^t / TF^t)$, (29)
$MS_{p,g}^t = \exp\!\left(-\dfrac{FS_{ss}^t}{FS_{p,g}^t + eps}\right)$, (30)
$J_{p,ss}^t = -D \dfrac{dc_{g,ss}^t}{dx_{p,ss}^t}$, (31)
$dc_{g,ss}^t = X_{m,g}^t - X_{ss}^t$, (32)
$dx_{p,ss}^t = (X_{ss}^t)^2 - (X_{p,g}^t)^2 + eps$. (33)
From the above overview, the pseudo-code for FLA is obtained in Algorithm 1.
Algorithm 1 FLA Algorithm
[Pseudo-code presented as an image in the original article.]

3. An Enhanced FLA (FLAS)

FLA can effectively mitigate the imbalance between exploration and exploitation. However, it faces the same challenge as other MAs: it falls into local optimal solutions. The main reason is that the physical-simulation search strategy of FLA prevents molecules from escaping a local optimum. Therefore, to overcome this problem, this study proposes a multi-strategy improved FLA by introducing several effective search strategies, including differential variation, Gaussian local variation, Levy flight, and a seagull-based global search strategy.

3.1. Improvement Strategy

3.1.1. Differential Variation Strategy

The differential variation strategy has the advantages of diversity, efficient search, low computing cost, strong robustness, and ease of implementation and understanding [3]. Introducing random perturbations into the population helps the algorithm maintain diversity during the search and prevents the population from converging prematurely. At the same time, it can generate new solutions that often differ from the parent solutions. Controlling the mutation operator makes more extensive exploration of the search space possible, which helps find potentially better solutions.
In the DO phase, FLA may yield locally optimal solutions due to the uneven diffusion of molecules. Therefore, to improve the results, we introduce a differential variation strategy to alleviate the original algorithm's tendency to fall into local solutions. First, the weighted position difference between two individuals is calculated. The result is then added to the position of a random individual to generate the mutant individual. The specific formula is as follows:
$X_1^{new} = X_1 + F_0 \times (X_1(randi(NT_{12}), :) - X_1(randi(NT_{12}), :))$, (34)
where $X_1^{new}$ represents the new position after updating, $X_1$ represents an individual in the tth iteration, and $X_1(randi(NT_{12}), :)$ represents a random individual position in the population NT12 (the two draws in Equation (34) are independent). In addition, F = 0.3 is the scaling factor. The adaptive adjustment mutation operator F0 is described as follows:
$F_0 = 1 + F \dfrac{u^2}{NT_{12}^2}$. (35)
In addition, the differential variation strategy is more adaptable to the constraints of the boundary. By setting appropriate parameters, the amplitude and direction of variation operation can be controlled to ensure that the generated variation solution satisfies the constraint conditions of the problem.
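The differential-variation step of Equation (34) can be sketched as follows (a minimal illustration with illustrative names; the scaling operator is taken as a plain input, and the two random draws are made independently):

```python
import random

def differential_mutation(pop, f0, rng):
    """Eq. (34): add the f0-scaled difference of two randomly chosen
    individuals to each individual (the two draws are independent)."""
    n = len(pop)
    out = []
    for x in pop:
        a = pop[rng.randrange(n)]   # first random individual
        b = pop[rng.randrange(n)]   # second random individual
        out.append([xi + f0 * (ai - bi) for xi, ai, bi in zip(x, a, b)])
    return out
```

When the two draws happen to coincide, the difference vector is zero and the individual is left unchanged, which is harmless but slightly reduces the effective mutation rate.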

3.1.2. Gauss Local Variation Strategy

In fact, the exploitation phase affects the convergence accuracy of the algorithm. To alleviate the low-accuracy problem of FLA, we introduce a mutation strategy in the exploitation phase to improve convergence performance. The Gaussian local variation effectively helps the algorithm further search for the optimal solution, expanding the search range of FLA in the later iteration stage [11]. The Gaussian local variation is calculated by Equation (36):
$X_1^{new}(i, j) = X_1(i, j) + DOF \times ((ub(j) - lb(j)) \times r_3 + lb(j)) \times normrnd(0, 1)$, (36)
where $normrnd(0, 1)$ represents a random number drawn from the standard normal distribution (mean 0, standard deviation 1). In the tth iteration, $X_1^{new}(i, j)$ denotes the updated position of the current individual, and r3 is a random number in the interval [0, 1].
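Equation (36) amounts to a normally distributed perturbation whose scale is tied to the search bounds and to DOF; a minimal sketch (names are our own, and DOF is passed in as a precomputed scalar):

```python
import random

def gaussian_local_mutation(x, dof, lb, ub, rng):
    """Gaussian local variation (Eq. (36)): perturb each coordinate with a
    standard-normal draw scaled by DOF and the bound width."""
    r3 = rng.random()
    return [xi + dof * ((ub - lb) * r3 + lb) * rng.gauss(0.0, 1.0) for xi in x]
```

Because DOF decays over the iterations, the perturbation automatically narrows from wide exploration to fine local search.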

3.1.3. Integrated Learning Strategies Based on Intersections

In the DO and EO phases, the optimal individuals contribute to the bidirectional search of the whole population. However, the individual optimal values in FLA are not representative, and cannot lead to realizing the global search. Therefore, in order to obtain the most representative optimal individuals, we are inspired by the crossover operator and introduce a crossover-based comprehensive learning (CCL) strategy [12]. The CCL strategy can mutate better individuals through the crossover operator to guide the individual global search. The specific calculation formula is Equation (37):
$X_1^{new} = r_1 \times X_1(i, j) + (1 - r_1) \times X_{eo1} + c_1 \times (X_1(i, j) - X_{eo1})$, (37)
where $c_1$ is a randomly generated number uniformly distributed in [−1, 1], and $r_1$ is a random number uniformly distributed in [0, 1].
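The CCL update of Equation (37) blends an individual with the equilibrium molecule and adds a signed crossover term; a scalar sketch with illustrative names:

```python
import random

def ccl_update(x_ij, x_eo, rng):
    """Crossover-based comprehensive learning (Eq. (37)):
    convex blend of the individual and the equilibrium molecule,
    plus a +/- crossover perturbation along their difference."""
    r1 = rng.random()              # uniform in [0, 1]
    c1 = rng.uniform(-1.0, 1.0)    # uniform in [-1, 1]
    return r1 * x_ij + (1 - r1) * x_eo + c1 * (x_ij - x_eo)
```

When the individual already sits at the equilibrium position, both the blend and the crossover term leave it unchanged, so the strategy only moves individuals that still differ from the guide.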

3.1.4. Levy Flight Strategy

For better convergence and global optimization, the proposed FLAS employs Levy flight for position update of subgroup N2 [30]. After the FLA updates the position, a Levy flight is performed to update the individual position. Levy flight strategy is designed to simulate the stochastic and exploratory nature of Levy flights. It can perform a global search in the search space to achieve the desired result of jumping out of the local optimum. The specific calculation formula is described as Equation (38):
$X_1^{new} = alfa \times levyrand(beta) \times X_1$, (38)
where levyrand denotes the Levy distribution [30]. In addition, alfa is the Levy scale parameter, with alfa = 0.05 + 0.04 × rand, and beta = 2/3.
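The article does not spell out how levyrand is generated; a common choice is Mantegna's algorithm, which we use in this hedged sketch of Equation (38) (function names are ours; alfa and beta follow the stated values):

```python
import math
import random

def levy_step(beta, rng):
    """One Levy-distributed step via Mantegna's algorithm (an assumed
    implementation of levyrand; the article leaves the generator unspecified)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.gauss(0.0, sigma)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def levy_update(x, rng, beta=2 / 3):
    """Eq. (38): X1_new = alfa * levyrand(beta) * X1, alfa = 0.05 + 0.04 * rand."""
    alfa = 0.05 + 0.04 * rng.random()
    step = levy_step(beta, rng)
    return [alfa * step * xi for xi in x]
```

The heavy tail of the Levy distribution occasionally produces very long jumps, which is exactly what lets the search escape a local optimum.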

3.1.5. Global Search Strategy of Gull Algorithm during Migration Phase

In the stable phase (SSO), due to the local exploitation of the algorithm, a large number of molecules may gather around the current position, which restricts the development of the FLA and prevents it from breaking through its intrinsic limitations. At this point, the global search strategy of the seagull algorithm [31] is added to accelerate the convergence rate of FLA and avoid collisions between molecules during motion [32]. D_alphs denotes the new, collision-free position of the molecule. The update process is shown in Figure 5, and the formula is as follows:
$D\_alphs = F_c \times X_1(i, :) + A_1 \times (X_{ss} - X_1(i, :))$, (39)
where $F_c = 2 - \sin(I) \times \frac{2}{T}$ is the control factor, $I \in [0, 0.8\pi]$, and $A_1 = 2 \times F_c \times r_1 - F_c$.
In the seagull optimization algorithm, the moving direction of each seagull individual is calculated relative to the optimal position, and the moving distance is adjusted based on fitness: the higher the fitness, the smaller the moving distance. This strategy is employed to update the molecule positions during the stable phase. The specific position update formula is as follows:
$X_e = D\_alphs \times e^{II} \times \cos(2\pi \cdot II)$, (40)
where Xe is the new position reached by moving in the direction of the optimal position, and $II = (F_c - 1) \times rand + 1$ is a random number that balances global and local search.
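A sketch of this seagull-style migration step; note that the Fc and A1 expressions follow our reconstruction of the garbled source equations and are therefore approximate, and all names are illustrative:

```python
import math
import random

def seagull_migration(x, x_ss, T, rng):
    """Seagull-style global search used in the SSO phase (Eqs. (39)-(40)).
    Fc and A1 use assumed (reconstructed) forms."""
    I = rng.uniform(0.0, 0.8 * math.pi)
    fc = 2 - math.sin(I) * 2 / T                 # control factor (assumed form)
    a1 = 2 * fc * rng.random() - fc
    # D_alphs: a new, collision-free position around the steady state (Eq. (39))
    d_alphs = [fc * xi + a1 * (xs - xi) for xi, xs in zip(x, x_ss)]
    II = (fc - 1) * rng.random() + 1             # balances global and local search
    # Spiral motion toward the optimum (Eq. (40))
    return [d * math.exp(II) * math.cos(2 * math.pi * II) for d in d_alphs]
```

The cosine-spiral term scatters aggregated molecules away from a crowded steady-state position, which is the anti-collision effect described above.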

3.2. The Improved FLA Steps

The FLAS improved algorithm is established by introducing the differential variation strategy, Gaussian local variation strategy, interleave-based comprehensive learning strategy, Levy flight strategy, and the global search strategy in the migration phase of the Gull algorithm. The FLAS algorithm process is as follows:
When TFt < 0.9 and $T_{DO}^t$ < rand, the positions of the NT12 molecules (obtained by Equation (8)) in the N1 population are updated with the differential variation strategy of Equation (34):
$X_1^{new}(i, :) = X_1^{new}(i, :) + F_0 \times (X_1(Randi(NT_{12}), :) - X_1(Randi(NT_{12}), :))$. (41)
At the same time, for the remaining individuals (N1 − NT12), the strategies in Equations (36) and (37) are adopted for the position updates. The formulas are as follows:
$rand < 0.8: \quad X_1^{new}(i, j) = X_{eo1}(j)$,
$rand < 0.9: \quad X_1^{new}(i, j) = X_1(i, j) + DOF \times ((ub(j) - lb(j)) \times r_3 + lb(j)) \times normrnd(0, 1)$,
$rand > 0.9: \quad X_1^{new}(i, j) = r_1 \times X_1(i, j) + (1 - r_1) \times X_{eo1} + c_1 \times (X_1(i, j) - X_{eo1}(j))$.
In the N2 population, individual location updating adopts the strategies in Equations (36) and (38).
$X_2^{new}(i, :) = X_{eo2} + DOF \times ((ub - lb) \times r_4 + lb) \times normrnd(0, 1)$,
$X_2^{new}(i, :) = alfa \times levyrand(beta) \times X_2^{new}(i, :)$.
When $T_{DO}^t$ < rand, the position update of the Ntransfer molecules (obtained with Equation (9)) in the N2 population is:
$X_2^{new}(i, :) = X_{eo1} + DF_g \times DOF \times rand(1, dim) \times (J \times X_{eo1} - X_2(i, :))$.
For the remaining individuals (N2 − Ntransfer), the same strategies in Equations (36) and (37) are adopted as for the N1 population in the $T_{DO}^t$ < rand phase:
$rand < 0.8: \quad X_2^{new}(i, j) = X_{eo2}(j)$,
$rand < 0.9: \quad X_2^{new}(i, j) = X_2(i, j) + DOF \times ((ub(j) - lb(j)) \times r_3 + lb(j)) \times normrnd(0, 1)$,
$rand > 0.9: \quad X_2^{new}(i, j) = r_1 \times X_2(i, j) + (1 - r_1) \times X_{eo2} + c_1 \times (X_2(i, j) - X_{eo2}(j))$.
In this phase, the N1 population updates its positions with the same strategies, Equations (36) and (38), as the N2 population in the $T_{DO}^t$ < rand phase. The formulas are as follows:
$X_1^{new}(i, :) = X_{eo1} + DOF \times ((ub - lb) \times r_4 + lb) \times normrnd(0, 1)$,
$X_1^{new}(i, :) = alfa \times levyrand(beta) \times X_1^{new}(i, :)$.
When TFt ≤ 1, the EO phase is entered, and the strategy of Equation (36) is adopted for the position updates of both the N1 and N2 populations. The formulas are as follows:
$X_1^{new}(i, :) = (X_{eo1} + Q_{eo} \times X_1(i, :) + Q_{eo} \times (MS \times X_{eo1} - X_1(i, :))) \times normrnd(0, 1)$,
$X_2^{new}(i, :) = (X_{eo2} + Q_{eo} \times X_2(i, :) + Q_{eo} \times (MS \times X_{eo2} - X_2(i, :))) \times normrnd(0, 1)$.
When TFt > 1, the SSO phase is entered, and Equations (55) and (56) are adopted for the position updates of the N1 and N2 populations. The formulas are as follows:
$X_1^{new}(i, :) = X_{ss1} + Q_g \times X_e + Q_g \times (MS \times X_{ss1} - X_e)$, (55)
$X_2^{new}(i, :) = X_{ss2} + Q_g \times X_e + Q_g \times (MS \times X_{ss2} - X_e)$. (56)
The basic steps of the FLAS:
Step 1. Specify both N and T, divide N into two equal small populations N1 and N2, randomly generate the initial positions of N1 and N2 individuals in the solution search space, and set the current iteration number t = 1.
Step 2. Calculate the fitness values of the N1 and N2 individuals and obtain the optimal individual positions.
Step 3. When TFt < 0.9 and $T_{DO}^t$ < rand, the N1 population enters the exploration phase (DO), and Equation (41) is used to update the individual locations of NT12.
Step 4. When $T_{DO}^t$ < rand, for the remaining individuals in N1: if rand < 0.8, the individual location is updated using Equation (43); if 0.8 < rand < 0.9, it is updated using Equation (44); otherwise, it is updated using Equation (45).
Step 5. When $T_{DO}^t$ ≥ rand, the individual locations in the N2 population are updated using Equation (46).
Step 6. In the case of $T_{DO}^t$ < rand, in the N2 population, individual positions are updated using Equation (46) for Ntransfer. For the remaining individuals (N2 − Ntransfer): if rand < 0.8, the individual position is updated using Equation (46); if 0.8 < rand < 0.9, it is updated using Equation (49); otherwise, it is updated using Equation (50).
Step 7. When $T_{DO}^t$ ≥ rand, individual positions in the N1 population are updated using Equation (52).
Step 8. When TFt ≤ 1, individual positions are updated using Equation (53) for N1 populations and individual positions are updated using Equation (54) for N2 populations.
Step 9. When TFt > 1, individual positions of population N1 are updated using Equation (55), and those in populations of N2 are updated using Equation (56).
Step 10. The boundary treatment of population location is carried out.
Step 11. Output the position and fitness values of the globally optimal individual.
The pseudo-code of FLAS is shown in Algorithm 2.
Algorithm 2 FLAS Algorithm
[Pseudo-code presented as an image in the original article.]

3.3. Time Complexity of the FLAS

The time complexity of MAs is influenced by the variable dimensionality D, the population size N, and the number of iterations T. Determining the time complexity (TC) of an algorithm helps evaluate its operational efficiency. In the FLAS algorithm, the TC required during the initialization phase is O(N × D). FLAS then enters the iterative search for updated solutions. In the exploration phase, when $T_{DO}^t$ < rand, the TC is O(NT12 × D) + O((N/2 − NT12) × D) + O(N/2 × D); when $T_{DO}^t$ ≥ rand, the TC is O(Ntransfer × D) + O((N/2 − Ntransfer) × D) + O(N/2 × D). The TC of the EO phase is O(N/2 × D) + O(N/2 × D). Finally, the TC of the exploitation phase is O(N/2 × D) + O(N/2 × D). The total TC of FLAS is calculated as:
O(FLAS) = O(N × D) + O(T × [O(NT12 × D) + O((N/2 − NT12) × D) + O(Ntransfer × D) + O((N/2 − Ntransfer) × D) + O(6 × N/2 × D)]) = O(N × D + T × 8 × (N/2) × D) = O(N × D × (1 + 4 × T)).
For a clearer description of the steps with which FLAS solves the optimization model, Figure 6 shows the flow chart of the algorithm.

4. Experimental Results

To verify the effectiveness of the FLAS method, the 23 benchmark functions and the CEC2020 test set are used to examine its optimization capability, and several algorithms are selected for comparison. The selected comparison algorithms include the differential evolution algorithm (DE) [33], the improved particle swarm optimization algorithm (PSO_ELPM) [34], the light spectrum optimizer (LSO) [35] inspired by physics and mathematics, the arithmetic optimization algorithm (AOA) [36], the Harris hawks optimization (HHO) [7] inspired by animal behavior, the improved golden jackal optimization (IGJO) [37], and the improved grey wolf optimizer (IGWO) [38]. The parameter values of the above 10 MAs are depicted in Table 1.
To accurately analyze the performance of FLAS, this article will use the following six performance metrics:
  • Best = min{best1, best2, …, bestm};
  • Worst = max{best1, best2, …, bestm};
  • Mean = (1/m) Σ_{i=1}^{m} best_i;
  • std = √((1/(m − 1)) Σ_{i=1}^{m} (best_i − Mean)²);
where best_i represents the optimal result of the i-th run, and m denotes the number of runs.
  • Rank: All algorithms are ranked according to the quality of their performance indicators. The sequence number is the corresponding Rank. If the specific values of the comparison algorithms are equal, they are recorded as having the same Rank.
  • Wilcoxon rank sum test: this test estimates whether a noticeable disparity exists between two algorithms, i.e., whether the two arrays of fitness values obtained after m runs come from continuous distributions with the same median. The p-values derived from the Wilcoxon rank sum test for the other nine algorithms are reported at the α = 0.05 level. Bolded data indicate insignificant differences between the comparison algorithms and FLAS. The “=” symbol indicates the number of functions for which there is no distinct difference between the results of the other MAs and FLAS; the “+” symbol represents the number of functions on which the other MAs outperform FLAS, while the “−” symbol represents the number of functions on which their results are inferior to those of FLAS.
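For concreteness, the first four statistics can be computed directly from the per-run best fitness values; the sketch below uses only the Python standard library, and the sample values are hypothetical.

```python
import statistics

def summarize_runs(best_values):
    """Best, Worst, Mean, and sample standard deviation of the
    per-run best fitness values (m independent runs)."""
    return {
        "Best": min(best_values),
        "Worst": max(best_values),
        "Mean": statistics.fmean(best_values),
        "std": statistics.stdev(best_values),  # 1/(m-1) normalization
    }

# Hypothetical best fitness values from m = 5 runs
runs = [0.12, 0.10, 0.15, 0.11, 0.12]
stats = summarize_runs(runs)
```

The sample standard deviation (with the 1/(m − 1) factor) matches the std formula above; `statistics.pstdev` would instead give the population version.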

4.1. Parameter Analysis of Levy Scale Alfa

For the update strategy based on Levy flights, the parameter alfa determines how much the Levy flight improves algorithm performance. Because of the heavy-tailed nature of the Levy flight, the random numbers it generates can only guide the overall convergence. The value of alfa controls the search range of the molecular neighborhood region: the larger alfa is, the larger the search range of the molecular neighborhood region and the stronger the tendency of the algorithm to converge. If alfa is too small, the search range of the molecular neighborhood region is small and the Levy flight has little influence on the search process, which reduces search capability and accuracy. Therefore, an appropriate value of alfa improves the exploration ability of FLAS around individual molecular neighborhood positions.
In order to find an appropriate alfa value, the effects of different alfa values on the convergence performance of the FLAS algorithm are discussed. Ten test functions from the CEC2020 test suite are selected for numerical experiments. Ten candidate parameter values were generated over the interval [0.01, 0.5] (0.01 + rand × 0.04, 0.05 + rand × 0.04, 0.1 + rand × 0.04, 0.15 + rand × 0.04, 0.2 + rand × 0.04, 0.25 + rand × 0.04, 0.3 + rand × 0.04, 0.35 + rand × 0.04, 0.4 + rand × 0.04, and 0.45 + rand × 0.04). For each of the ten parameter values, the average results obtained using FLAS over 20 independent runs are shown in Table 2. The maximum number of iterations is 1000 and the population size is 30.
From the results in Table 2, it can be seen that the accuracy of the FLAS solution is higher when alfa is taken as 0.05 + 0.04 × rand. The reason is that a smaller alfa allows FLAS to strengthen the connection between the population and the best position found so far, which enhances the local exploitation ability of FLAS. Smaller values of alfa give better experimental results on the more complex hybrid and composition functions. This suggests that smaller alfa values increase the search of the solution space on more comprehensive functions and help the Levy flight strategy better exploit its ability to jump out of local solutions. Thus, smaller values of alfa contribute to the local search ability of FLAS when dealing with complex functions. The last part of the table lists the rankings for different values of alfa; FLAS performs well on all functions when alfa is 0.05 + 0.04 × rand.
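This section does not spell out the Levy step generator itself; a common choice, shown here only as an illustrative sketch, is Mantegna's algorithm, with the resulting step scaled by alfa as discussed above.

```python
import math
import random

def levy_step(alfa, beta=1.5, rng=random):
    """One Levy-distributed step via Mantegna's algorithm, scaled by the
    neighborhood-size parameter alfa (beta is the stability index)."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)   # std of the numerator variable
    u = rng.gauss(0, sigma_u)
    v = rng.gauss(0, 1)
    return alfa * u / abs(v) ** (1 / beta)

random.seed(0)
alfa = 0.05 + 0.04 * random.random()  # the value recommended by Table 2
steps = [levy_step(alfa) for _ in range(1000)]
```

Because the denominator |v| can be close to zero, occasional steps are much larger than the typical one, which is exactly the long-jump behavior that helps escape local optima.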

4.2. Qualitative Analysis of FLAS

Exploration and exploitation (EXE) are two important concepts in MAs. Exploration refers to looking for new solutions or improving existing ones, whereas exploitation refers to optimizing and utilizing existing solutions in the hope of obtaining better results. Whether a population is currently more inclined to explore or to exploit can be judged by measuring the differences between individuals: if the differences between individuals are large, the population is currently more inclined to explore; if they are small, the population is more inclined to exploit [39]. How the algorithm balances these two capabilities is key to its performance. Therefore, the proportion of the two capabilities during the iterative process is calculated as shown in Equation (58):
Diversity_j = (1/N) Σ_{i=1}^{N} |median(x^j) − x_i^j|,
Diversity = (1/D) Σ_{j=1}^{D} Diversity_j.
where median(x^j) represents the median value of the j-th dimension over all N individuals in the population. After taking the distance to the median in dimension j for each individual i, the average Diversity_j is obtained over all individuals. Then, the average of Diversity_j over all dimensions gives Diversity. The percentages of EXE of the population in each iteration are given by Equations (59) and (60):
Exploration(%) = (Diversity(t)/Diversity_max) × 100,
Exploitation(%) = (|Diversity(t) − Diversity_max|/Diversity_max) × 100.
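Equations (58)–(60) translate directly into code; the sketch below uses an illustrative three-individual population and an assumed Diversity_max.

```python
import statistics

def diversity(population):
    """Eq. (58): mean absolute distance of each individual from the
    per-dimension median, averaged over all dimensions."""
    n, d = len(population), len(population[0])
    per_dim = []
    for j in range(d):
        med = statistics.median(x[j] for x in population)
        per_dim.append(sum(abs(med - x[j]) for x in population) / n)
    return sum(per_dim) / d

def exe_percent(div_t, div_max):
    """Eqs. (59)-(60): exploration and exploitation percentages."""
    exploration = div_t / div_max * 100
    exploitation = abs(div_t - div_max) / div_max * 100
    return exploration, exploitation

pop = [[0.0, 1.0], [2.0, 3.0], [4.0, 5.0]]
div = diversity(pop)                      # 4/3 in each dimension here
expl, expt = exe_percent(div, div_max=2.0)
```

When Diversity(t) ≤ Diversity_max, the two percentages sum to 100, so plotting them over the iterations yields exactly the red/blue EXE diagrams described below.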
Firstly, the EXE ability of FLAS is tested on the 23 benchmark functions. The unimodal test functions (F1–F7) relate to the capability of finding the optimal solution; the multimodal functions (F8–F13) test an MA's capability to explore and escape local optima, because they contain many local minima; and the fixed-dimensional multimodal functions (F14–F23) test the EXE ability. The EXE diagram of FLAS is displayed in Figure 7, in which exploration is represented by the red area and exploitation by the blue area. The final exploitation percentage should be close to 100% because the population gradually approaches the optimum in the late phase of evolution, and the whole solution set concentrates near the optimum. In addition, the percentages of EXE should alternate as the iteration progresses: the blue area goes up and eventually approaches 100%, while the red area comes down and eventually approaches 0%. Figure 7 shows that FLAS converges very fast on all functions except F8, F17, and F20, and the population soon approaches the optimal solution. According to recent research [40], an algorithm performs best when exploration and exploitation account for about 10% and 90%, respectively, of the search process. FLAS meets this requirement: the exploitation of the population finally accounts for about 90%, and the exploration accounts for about 10%.
To further verify the stability and reliability of the EXE behavior, FLAS is also tested on CEC2022. The EXE diagram on CEC2022 is shown in Figure 8, from which FLAS demonstrates its ability to quickly find globally optimal solutions and to switch flexibly among unimodal, basic, and composition functions. When dealing with complex hybrid functions, FLAS shows high exploration ability in the later iterations and avoids getting trapped in local optimal solutions. FLAS adopts a strategy that extends the time required to transition to the exploitation phase, effectively improving its optimization capability.

4.3. Comparison of FLAS and Other MAs on 23 Benchmark Functions

In order to validate the effectiveness of the proposed FLAS, comparison experiments with the PSO_ELPM, HHO, IGWO, DE, LSO, BWO, AOA, and IGJO algorithms are conducted on the 23 benchmark functions. All comparison algorithms are run 30 times. In addition, N is chosen to be 500 for all algorithms. The iterative convergence plots and boxplots of FLAS and the other comparison algorithms on the 23 benchmark functions are given in Figure 9 and Figure 10, respectively. In addition, the results of the comparison experiments for the four metrics are given in Table 3. To statistically validate the effectiveness of the proposed algorithm, the statistical results of the Wilcoxon rank sum test for FLAS and the other methods are given in Table 4.
As illustrated in Figure 9, FLAS demonstrates a noticeable superiority over the other MAs, particularly on F1–F7. For the multimodal test functions F8–F13, FLAS outperforms the other MAs in convergence speed and accuracy. For the low-dimensional multimodal test functions (F14–F23), FLAS is better than the other MAs, particularly on F14, F15, and F20, and accurately obtains the optima of the benchmark functions. For the F14–F23 functions, FLAS transitions quickly between the early search phase and the late exploitation phase, converging near the optimal position at the beginning of the iteration; it then progressively locates the optimal position and updates the solution, confirming the earlier observations. FLAS performs competitively across the three types of functions and maintains a consistent dominance on most of them. In addition, the results show that FLAS is able to balance exploratory and exploitative search. From Figure 10, FLAS obtains lower and narrower boxes on most functions; the distribution of the objective values of FLAS is more concentrated than those of the other intelligent algorithms, which illustrates the consistency and stability of FLAS. According to the comparison results in Table 3, FLAS ranks first overall: it is ranked first on 15 of the test functions and second on the remaining 5 test functions. For the unimodal functions F1–F7, FLAS is ranked at least second in all cases, indicating that the proposed FLAS can find the unimodal optimal solution effectively. In addition, FLAS ranks first on all multimodal functions, indicating that FLAS can effectively avoid the interference of local solutions. Competitive performance is also demonstrated on complex problems such as the fixed-dimensional functions.
Considering the contingency of the test results, we further analyze the experimental results from the perspective of statistical tests. From Table 4, the comparison algorithms outperform FLAS on only two functions; FLAS is clearly superior to the PSO_ELPM, HHO, DE, LSO, FLA, BWO, IGJO, AOA, and IGWO algorithms. Therefore, on the benchmark functions, FLAS converges significantly better. From the experimental results, the proposed algorithm can effectively solve unimodal as well as multimodal optimization problems and obtain better optimization results, although at the cost of a longer running time.

4.4. Comparison between FLAS and Other MAs on CEC2020

FLAS and the other MAs are analyzed on the CEC2020 test set. The test functions have 20 dimensions, and each algorithm is run 30 times independently. As shown in Table 5, FLAS ranks first on F1, F4–F5, and F10. Among the ten algorithms, FLAS is in the top three on 90% of the tested functions. In Figure 11, FLAS converges the fastest on all functions except F6. Figure 12 shows boxplots of the MAs and their data distributions. Figure 13 shows the radar plots of the MAs. According to the data listed in Table 6, among the nine groups of comparison algorithms, only one function of the PSO_ELPM algorithm is superior to FLAS; the others are inferior to FLAS.

4.5. Comparison of FLAS and DIRECT in CEC2020

In order to further validate the convergence and optimization performance of the proposed FLAS on complex optimization problems, this section compares FLAS with the well-known deterministic optimization method DIRECT [41] on the CEC2020 test functions. To ensure the validity of the experiments, both FLAS and DIRECT perform 10,000 function evaluations [42]. Meanwhile, the population size of FLAS is set to 30. For the CEC2020 test functions, the dimension of all ten functions is set to 10. Figure 14 gives the operational zones built from 30 runs of the FLAS method and the operational characteristics of the DIRECT method on the ten CEC2020 functions. The upper and lower boundaries of the zone are shown as dark blue curves [43].
As can be seen from the experimental plots, the performance of the proposed FLAS is competitive with the deterministic method. FLAS demonstrates a clear optimization advantage on cec02, cec05, cec06, and cec07, where even the worst results of FLAS are comparable to or better than those of DIRECT. For the cec01, cec08, cec09, and cec10 test functions, the results of DIRECT are better than the average results of FLAS and comparable to the best results of FLAS; however, the best results of FLAS tend to be better, indicating that FLAS has a higher upper bound. On cec04, both FLAS and DIRECT obtain the optimal result. Only on cec03 does DIRECT obtain better results than FLAS. Thus, FLAS can also obtain better optimization results than deterministic methods.
In addition, to effectively demonstrate the convergence of FLAS, we give plots of the convergence results on the ten CEC2020 test functions for different numbers of function evaluations. Considering that the convergence speed of FLAS varies across test functions, the number of function evaluations is set to 10,000 for the cec01–cec04, cec06, and cec08–cec10 test functions, and to 30,000 for the cec05 and cec07 test functions. The convergence curves of FLAS are given in Figure 15. To minimize the negative impact of the result magnitudes on the presentation of the convergence plots, only the convergence results for cec01 are given in Figure 15a.
From the results of the convergence plots, it can be found that FLAS guarantees effective convergence for different test functions.

5. Engineering Examples

In order to further evaluate the ability of FLAS to solve real-world applications, FLAS is compared with other well-performing MAs on several engineering design problems. Two types of optimization problems are selected: (1) box-constrained problems, which have only upper and lower bounds on the variables, including the gear transmission system design [44] and the gas transmission compressor design [45]; and (2) generally constrained problems, which have more complex constraints, including the reducer design [46], three-bar truss design [47], piston rod optimization design [48], pressure vessel design [49], and stepped cone pulley problem [50]. We conduct a comprehensive test and evaluation of FLAS on these specific engineering design problems to understand its performance and feasibility in practical applications [51]. This approach helps us better understand the advantages and limitations of FLAS and provides powerful solutions to optimization problems in the engineering field [52,53].

5.1. Reducer Design Problems

The reducer design problem seeks a reducer that meets specific requirements and achieves the optimal speed ratio between a given input speed and the output speed [54]. In Figure 16, the reducer design problem has seven decision variables: the face width (xa1), gear module (xa2), number of pinion teeth (xa3), length of the first shaft between bearings (xa4), length of the second shaft between bearings (xa5), diameter of the first shaft (xa6), and diameter of the second shaft (xa7). The problem requires minimizing the design weight under constraints on the bending stress of the gear teeth, the surface stress, the transverse deflections, and the axial stresses. By the definition of a gearbox, the problem is more complex than many practical applications because it has more constraints. The objective function with respect to x = [xa1, xa2, xa3, xa4, xa5, xa6, xa7] and the 11 constraints are listed below.
min f(xa) = 0.7854 xa1 xa2² (3.3333 xa3² + 14.9334 xa3 − 43.0934) − 1.508 xa1 (xa6² + xa7²) + 7.4777 (xa6³ + xa7³) + 0.7854 (xa4 xa6² + xa5 xa7²),
Variable values range from:
2.6 ≤ xa1 ≤ 3.6, 0.7 ≤ xa2 ≤ 0.8, 17 ≤ xa3 ≤ 28, 7.3 ≤ xa4 ≤ 8.3, 7.8 ≤ xa5 ≤ 8.3, 2.9 ≤ xa6 ≤ 3.9, 5.0 ≤ xa7 ≤ 5.5.
The constraint conditions are:
g1(xa) = 27/(xa1 xa2² xa3) − 1 ≤ 0,
g2(xa) = 397.5/(xa1 xa2² xa3²) − 1 ≤ 0,
g3(xa) = 1.93 xa4³/(xa2 xa3 xa6⁴) − 1 ≤ 0,
g4(xa) = 1.93 xa5³/(xa2 xa3 xa7⁴) − 1 ≤ 0,
g5(xa) = √((745 xa4/(xa2 xa3))² + 16.9 × 10⁶)/(110 xa6³) − 1 ≤ 0,
g6(xa) = √((745 xa5/(xa2 xa3))² + 157.5 × 10⁶)/(85 xa7³) − 1 ≤ 0,
g7(xa) = xa2 xa3/40 − 1 ≤ 0,
g8(xa) = 5 xa2/xa1 − 1 ≤ 0,
g9(xa) = xa1/(12 xa2) − 1 ≤ 0,
g10(xa) = (1.5 xa6 + 1.9)/xa4 − 1 ≤ 0,
g11(xa) = (1.1 xa7 + 1.9)/xa5 − 1 ≤ 0,
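To give a sense of how such a constrained model is typically handed to a metaheuristic, the sketch below folds the objective and the 11 constraints into a single static-penalty fitness; the penalty weight rho and the test design are illustrative choices, not the paper's reported optimum.

```python
import math

def reducer_objective(x):
    """Speed-reducer weight for x = [xa1, ..., xa7] (objective above)."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))

def reducer_constraints(x):
    """Constraint values g1..g11; the design is feasible when all are <= 0."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return [
        27.0 / (x1 * x2**2 * x3) - 1,
        397.5 / (x1 * x2**2 * x3**2) - 1,
        1.93 * x4**3 / (x2 * x3 * x6**4) - 1,
        1.93 * x5**3 / (x2 * x3 * x7**4) - 1,
        math.sqrt((745 * x4 / (x2 * x3))**2 + 16.9e6) / (110 * x6**3) - 1,
        math.sqrt((745 * x5 / (x2 * x3))**2 + 157.5e6) / (85 * x7**3) - 1,
        x2 * x3 / 40 - 1,
        5 * x2 / x1 - 1,
        x1 / (12 * x2) - 1,
        (1.5 * x6 + 1.9) / x4 - 1,
        (1.1 * x7 + 1.9) / x5 - 1,
    ]

def penalized_fitness(x, rho=1e6):
    """Static-penalty fitness: weight plus rho times the squared violations."""
    violation = sum(max(0.0, g) ** 2 for g in reducer_constraints(x))
    return reducer_objective(x) + rho * violation

# Illustrative feasible design (not the paper's reported optimum)
x = [3.5, 0.7, 17, 7.3, 7.8, 3.36, 5.3]
weight = penalized_fitness(x)
```

At a feasible point, the penalty term vanishes and the fitness reduces to the plain weight, so the metaheuristic can optimize the penalized function as if it were unconstrained.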
FLAS is compared with eight algorithms on this problem: PSO_ELPM, HHO, DE, FLA, the beluga whale optimization algorithm (BWO) [55], the rat swarm optimizer (RSO) [56], the reptile search algorithm (RSA) [57], and the tunicate swarm algorithm (TSA) [46]. Table 7 shows the minimum total weight obtained using FLAS and the other MAs. The total weight obtained using FLAS is the smallest, at 2997.0, while the total weight obtained using RSO is the largest, at 8.76E+07. As shown in Table 8, the FLAS algorithm is the best on all metrics; its standard deviation is only 5.7526, which indicates that FLAS produces accurate and stable results.

5.2. Three-Bar Truss Design

The objective of the three-bar truss design problem is to manipulate two parameters (xA1, xA2) to minimize the weight of the truss. The problem has three constraints: stress (σ), deflection, and buckling. A schematic diagram of the three-bar truss design problem is shown in Figure 17. The mathematical model for the design of the three-bar truss is shown below:
Functions as follows:
min f(X) = (2√2 xA1 + xA2) × l
s.t. g1(X) = ((√2 xA1 + xA2)/(√2 xA1² + 2 xA1 xA2)) P − σ ≤ 0,
g2(X) = (xA2/(√2 xA1² + 2 xA1 xA2)) P − σ ≤ 0,
g3(X) = (1/(√2 xA2 + xA1)) P − σ ≤ 0,
where 0 ≤ xA1, xA2 ≤ 1. In addition, l = 100 cm, P = 2 kN/cm², and σ = 2 kN/cm².
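The model translates into a few lines of code; the test point below is a near-optimal design often quoted in the literature for this problem and is used here only for illustration.

```python
import math

def truss_weight(x1, x2, l=100.0):
    """Truss weight f(X) = (2*sqrt(2)*x1 + x2) * l."""
    return (2 * math.sqrt(2) * x1 + x2) * l

def truss_constraints(x1, x2, P=2.0, sigma=2.0):
    """Stress constraints g1-g3; all must be <= 0 for feasibility."""
    denom = math.sqrt(2) * x1**2 + 2 * x1 * x2
    return [
        (math.sqrt(2) * x1 + x2) / denom * P - sigma,
        x2 / denom * P - sigma,
        1.0 / (math.sqrt(2) * x2 + x1) * P - sigma,
    ]

# Near-optimal cross-sections often quoted in the literature (illustrative)
x1, x2 = 0.7887, 0.4082
weight = truss_weight(x1, x2)
```

Note that g1 is nearly active at this design, which is typical for the optimum of this problem: the best solutions sit on the stress-constraint boundary.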
Table 9 shows that HHO attains the smallest function value, followed by FLAS. However, according to Table 10, FLAS obtains the best mean value. Apart from a slightly higher standard deviation, FLAS still ranks first in solution quality.

5.3. Design Problems of Gear Group

The purpose of the gear train design problem is to find the numbers of gear teeth that minimize the cost of the gear ratio [58], as shown in Figure 18. The problem is an integer unconstrained optimization problem with four design variables, which denote the numbers of teeth of the gears and are denoted by TA, TB, TC, and TD, respectively [59,60].
Let X = [ x 1 , x 2 , x 3 , x 4 ] = [ T A , T B , T C , T D ] , and the following mathematical model is obtained:
min f(X) = (1/6.931 − (x1 x2)/(x3 x4))²,
where 12 ≤ x1, x2, x3, x4 ≤ 60.
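A sketch of the objective is given below, assuming the grouping x1·x2/(x3·x4) for the achieved ratio; the tooth combination (19, 16, 43, 49) is a widely reported near-optimal solution, and the naive random search is illustrative only, not the paper's optimizer.

```python
import random

def gear_cost(x1, x2, x3, x4):
    """Squared deviation of the achieved ratio x1*x2/(x3*x4) from the
    required ratio 1/6.931 (the grouping of the teeth is an assumption)."""
    return (1.0 / 6.931 - (x1 * x2) / (x3 * x4)) ** 2

# A widely reported near-optimal tooth combination
best_known = gear_cost(19, 16, 43, 49)   # error on the order of 1e-12

# Naive random search over the integer box [12, 60]^4, for comparison
random.seed(1)
best = min(gear_cost(*(random.randint(12, 60) for _ in range(4)))
           for _ in range(20000))
```

Because the variables are integers, MAs for this problem typically round the candidate positions before evaluating the objective.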
The proposed FLAS is compared with other MAs on this problem, including the improved particle swarm optimization (PSO_ELPM), the Harris hawks optimization (HHO), the differential evolution algorithm (DE), the sine cosine algorithm (SCA) [22], Fick's law algorithm (FLA), the beluga whale optimization algorithm (BWO), the rat swarm optimizer (RSO), the slime mould algorithm (SMA) [61], and the improved grey wolf optimizer (IGWO). In Table 11, except for DE, FLA, RSO, and SMA, all the algorithms obtain the minimum value, which indicates that these MAs have the same effect in solving the extreme value of this problem. In Table 12, the reference indices calculated using FLAS are smaller; FLAS has the best solution quality and relatively stable results among these MAs.

5.4. Piston Rod Optimization Design

The primary purpose of this engineering design is to minimize the oil volume when the piston rod is lifted from 0° to 45° by positioning four piston components: H (w1), B (w2), D (w3), and X (w4). Figure 19 is a schematic of the piston rod [62], and the mathematical model is developed as follows:
min f(w) = (1/4) π w3² (L2 − L1),
s.t. g1(w) = Q L cos θ − R F ≤ 0, g2(w) = Q (L − w4) − Mmax ≤ 0, g3(w) = 1.2 (L2 − L1) − L1 ≤ 0, g4(w) = w3/2 − w2 ≤ 0,
where R = |−w4 (w4 sin θ + w1) + w1 (w2 − w4 cos θ)|/√((w4 − w2)² + w1²), F = π P w3²/4, L1 = √((w4 − w2)² + w1²), and L2 = √((w4 sin θ + w1)² + (w2 − w4 cos θ)²).
The variables w1, w2, and w3 vary within [0.05, 500], and w4 belongs to [0.05, 120].
Table 13 and Table 14 compare FLAS with PSO_ELPM, HHO, DE, SCA, FLA, BWO, RSO, SMA, and IGWO, respectively. In Table 13, FLAS obtains the lowest cost. Table 14 shows that the results of FLAS are slightly better than those of other MAs. In addition, FLAS also obtained a small difference between the results obtained by 30 runs, which indicates that FLAS has better robustness while achieving optimal results.

5.5. Design of Gas Transmission Compressor

In the cost minimization model, the total cost is to be kept to a minimum, where D, pl, ps, L, and η (η = pl/ps) are the relevant coefficients [3], as shown in Figure 20. Let m = [m1, m2, m3] = [L, λ, D], and the following mathematical model is established:
min f(m) = 3.69 × 10⁴ m3 + 7.72 × 10⁸ m1⁻¹ m2^0.219 − 765.43 × 10⁶ m1⁻¹ + 8.61 × 10⁵ m1^(1/2) (m2² − 1)^(−1/2) m3^(−2/3),
s.t. m1, m2, m3 > 0, 10 ≤ m1 ≤ 55, 1.1 ≤ m2 ≤ 2, 10 ≤ m3 ≤ 40.
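Since this problem has only box constraints, candidate solutions can be evaluated directly; the naive random search below is only an illustrative baseline for the cost function, not the optimizer used in the paper.

```python
import random

def gas_cost(m1, m2, m3):
    """Gas-transmission compressor cost, term by term as in the model above."""
    return (3.69e4 * m3
            + 7.72e8 * m2**0.219 / m1
            - 765.43e6 / m1
            + 8.61e5 * m1**0.5 * (m2**2 - 1)**-0.5 * m3**(-2.0 / 3.0))

# Naive random search inside the box constraints (illustrative baseline)
random.seed(2)
best = min(gas_cost(random.uniform(10, 55),
                    random.uniform(1.1, 2),
                    random.uniform(10, 40))
           for _ in range(50000))
```

Note that m2 > 1.1 keeps the (m2² − 1) term positive, so the cost is well defined and positive everywhere inside the box.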
FLAS is compared with nine other MAs, namely PSO_ELPM, HHO, DE, SCA, FLA, BWO, RSO, SMA, and IGWO. The solution results are shown in Table 15. Comparing the data for each algorithm in Table 16, FLAS obtains the best result, 2,964,375.810043. Although the standard deviation of FLAS is not the smallest, its worst and mean values are relatively small, which indicates that FLAS is more accurate than the other comparison MAs.

5.6. Pressure Vessel Design Problems (PVD)

The ultimate requirement of pressure vessel design is to minimize the cost of fabrication, welding, and materials for the pressure vessel [63]. As shown in Figure 21, the cylindrical vessel is capped at both ends by hemispherical heads. Four design variables need to be considered for optimization: the shell thickness Ts, the head thickness Th, the inner radius R, and the length L of the cylindrical section. Let E = [e1, e2, e3, e4] = [Ts, Th, R, L]. The mathematical optimization model for the pressure vessel design is as follows:
min f(E) = 0.6224 e1 e3 e4 + 1.7781 e2 e3² + 3.1661 e1² e4 + 19.84 e1² e3,
s.t. g1(E) = −e1 + 0.0193 e3 ≤ 0,
g2(E) = −e2 + 0.00954 e3 ≤ 0,
g3(E) = −π e3² e4 − (4/3) π e3³ + 1,296,000 ≤ 0,
g4(E) = e4 − 240 ≤ 0,
where 0 ≤ e1 ≤ 99, 0 ≤ e2 ≤ 99, 10 ≤ e3 ≤ 200, 10 ≤ e4 ≤ 200.
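As an illustration, the model can be checked at a design close to the solutions commonly reported for this problem; the length e4 is padded slightly above the commonly quoted value so that the near-active volume constraint g3 stays feasible.

```python
import math

def pv_cost(e1, e2, e3, e4):
    """Pressure-vessel fabrication cost for (Ts, Th, R, L)."""
    return (0.6224 * e1 * e3 * e4 + 1.7781 * e2 * e3**2
            + 3.1661 * e1**2 * e4 + 19.84 * e1**2 * e3)

def pv_constraints(e1, e2, e3, e4):
    """g1..g4; all must be <= 0 for a feasible design."""
    return [
        -e1 + 0.0193 * e3,
        -e2 + 0.00954 * e3,
        -math.pi * e3**2 * e4 - (4.0 / 3.0) * math.pi * e3**3 + 1296000,
        e4 - 240,
    ]

# Illustrative design near commonly reported solutions (L padded slightly)
e = (0.8125, 0.4375, 42.0984, 176.64)
feasible = all(g <= 0 for g in pv_constraints(*e))
cost = pv_cost(*e)
```

The thicknesses 0.8125 and 0.4375 reflect the discrete plate-thickness multiples of 0.0625 in the classical problem statement, and both g1 and g3 are close to active at the optimum.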
Therefore, FLAS is compared with PSO_ELPM, HHO, DE, SCA, FLA, BWO, RSO, SMA, and IGWO. From the data in Table 17 and Table 18, it is clear that the FLAS algorithm works best among the 10 MAs, which indicates that FLAS has good applicability in this problem.

5.7. Step Cone Pulley Problem

The goal of the step-cone pulley design problem is to design a four-step cone pulley of minimum weight using five design variables: four for the diameters of the steps and a fifth for the width of the pulley. Figure 22 displays the step-cone pulley problem. It is assumed that the cone pulleys and the belt have the same width. There are 11 constraints in total, 3 of which are equality constraints and the rest inequality constraints. The constraints ensure that the belt length, the tension ratios, and the power transmitted by the belt are the same for all steps. The design power of the stepped pulley is at least 0.75 hp (0.75 × 745.6998 W), with an input speed of 350 rpm and output speeds of 750, 450, 250, and 150 rpm, respectively [21]. The mathematical expression for the weight of the four-step cone pulley to be optimized is as follows:
min f(w) = ρ w5 [w1² {1 + (N1/N)²} + w2² {1 + (N2/N)²} + w3² {1 + (N3/N)²} + w4² {1 + (N4/N)²}],
s.t. h1(w) = Cb1 − Cb2 = 0, h2(w) = Cb1 − Cb3 = 0, h3(w) = Cb1 − Cb4 = 0,
Qi(w) = Ri ≥ 2 (i = 1, 2, 3, 4),
Qi(w) = Pi ≥ (0.75 × 745.6998) (i = 5, 6, 7, 8),
where ρ = 7200 kg/m³, a = 3 m, μ = 0.35, s = 1.75 MPa, and t = 8 mm; the mathematical expressions of Cbi, Pi, and Ri are, respectively:
Cbi = (π di/2)(1 + Ni/N) + ((Ni/N − 1)² di²)/(4a) + 2a,
Pi = s t w5 [1 − exp(−μ {π − 2 sin⁻¹{(Ni/N − 1) di/(2a)}})] (π di Ni/60),
Ri = exp(μ {π − 2 sin⁻¹{(Ni/N − 1) di/(2a)}}), (i = 1, 2, 3, 4),
FLAS is compared with PSO_ELPM, HHO, DE, SCA, FLA, BWO, RSO, SMA, and IGWO in solving the stepped cone pulley problem. Table 19 and Table 20 show that the optimal value of the FLAS algorithm is the closest to the true value among the 10 MAs, which fully indicates that FLAS performs best.

6. Parameter Estimation of Solar Photovoltaic Model

The high-precision estimation of solar PV parameters is an urgent problem in power systems and the key to improving the output of power systems. In this section, the parameters of the single diode model (SDM) in the photovoltaic (PV) model [64] are optimized using FLAS to further explore its effectiveness. In the experiment, two sets of data [65], from the RTC France PV cell and the Photowatt-PWP201 PV module, are used: the parameters of the SDM of the RTC France cell are estimated, and the unknown parameters of the Photowatt-PWP201 module are estimated.
Because of its simplicity, SDM has wide applications in simulating the performance of solar photovoltaic systems, and its structure is shown in Figure 23.
The relationship between the PV current, diode current and resistance current in the circuit and the output current is expressed as follows:
I_O^A = I_PV^A − I_d^A − I_sh^A,
I_d^A = I_Rs^A [exp(e(U_O^V + R_Is^Ω I_O^A)/(n k T_K)) − 1],
I_sh^A = (U_O^V + R_Is^Ω I_O^A)/R_Ip^Ω.
where I_O^A is the output current of SDM, I_PV^A is the photovoltaic current, I_Rs^A represents the reverse saturation current of SDM, U_O^V is the output voltage, R_Is^Ω and R_Ip^Ω are the series and parallel resistances of SDM, respectively, e is the electronic charge, usually 1.60217646 × 10⁻¹⁹ C, n is the ideality factor of the single diode, k is the Boltzmann constant with a value of 1.3806503 × 10⁻²³ J/K, and T_K is the Kelvin temperature [66]. In Formula (68), since k, e, and T_K are all fixed constants, there are five parameters to be optimized in SDM: I_PV^A, R_Is^Ω, R_Ip^Ω, I_Rs^A, and n, constituting the decision vector X_SDM = [I_PV^A, R_Is^Ω, R_Ip^Ω, I_Rs^A, n] of the parameter optimization problem.
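The SDM equation is implicit in the output current, so evaluating a candidate parameter vector requires a numerical solve at each measured voltage; the sketch below uses Newton's method, with a hypothetical parameter set inside typical SDM bounds (not the fitted values from the paper).

```python
import math

Q = 1.60217646e-19   # electron charge (C)
K = 1.3806503e-23    # Boltzmann constant (J/K)

def sdm_current(V, params, T_kelvin=306.15, iters=50):
    """Solve the implicit SDM equation I = Ipv - Id - Ish for the output
    current with Newton's method. params = (Ipv, Irs, Rs, Rp, n); the
    temperature defaults to 33 degrees C as in the RTC France data set."""
    Ipv, Irs, Rs, Rp, n = params
    Vt = n * K * T_kelvin / Q          # ideality factor times thermal voltage
    I = Ipv                            # initial guess near the photocurrent
    for _ in range(iters):
        ex = math.exp((V + Rs * I) / Vt)
        f = Ipv - Irs * (ex - 1) - (V + Rs * I) / Rp - I
        df = -Irs * ex * Rs / Vt - Rs / Rp - 1
        I -= f / df
    return I

# Hypothetical parameter set inside typical SDM bounds (not fitted values)
params = (0.76, 3.2e-7, 0.036, 53.0, 1.48)
I_sc = sdm_current(0.0, params)        # current at zero output voltage
```

Each fitness evaluation in the parameter-estimation problem repeats this solve for every measured (V, I) pair before the error measures below are computed.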

PV Parameter Optimization Model and Experimental Setup

In a photovoltaic system, the output voltage U_O^V and output current I_O^A are the actual data measured in the experiment. The goal of the model is to find the values of the unknown parameters [67] through MAs so as to minimize the error. Jiao et al. [68] used the root-mean-square error (RMSE) to measure the deviation and, based on this, established the following PV system parameter optimization model:
min RMSE_k(X) = √((1/N_k) Σ_{i=1}^{N_k} h_k(U_{Oi}^V, I_{Oi}^A, X)²),
where X represents the set of unknown parameters, k = 1 is the label of the SDM module model, and N_k represents the amount of data obtained for the k-th model. h_k(U_{Oi}^V, I_{Oi}^A, X) represents the error when the i-th output voltage and current of the k-th model are U_O^V and I_O^A, respectively:
h_k(U_{Oi}^V, I_{Oi}^A, X) = I_{Oi,k}^{A,left} − I_{Oi,k}^{A,right},
where I_{Oi,k}^{A,left} and I_{Oi,k}^{A,right} represent the left and right sides of the output current equation, respectively. FLAS is used to find the vector X that minimizes RMSE_k(X). The individual absolute error (IAE) and the relative error (RE) are used to evaluate the performance of the algorithm more accurately:
IAE = |I_O^A − I_Om^A|,
RE = (I_O^A − I_Om^A)/I_O^A,
where I_O^A is the measured real current value and I_Om^A is the model current value calculated with the optimized parameters.
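The three error measures translate directly into code; the measured and simulated currents below are illustrative values, not the paper's data.

```python
import math

def rmse(measured, simulated):
    """Root-mean-square error between measured and simulated currents."""
    n = len(measured)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(measured, simulated)) / n)

def iae(i_meas, i_model):
    """Individual absolute error |I - Im|."""
    return abs(i_meas - i_model)

def rel_err(i_meas, i_model):
    """Relative error (I - Im) / I."""
    return (i_meas - i_model) / i_meas

# Illustrative current values (not the paper's measurements)
I_meas = [0.7640, 0.7620, 0.7605]
I_model = [0.7641, 0.7617, 0.7601]
err = rmse(I_meas, I_model)
```

RMSE is the quantity the optimizer minimizes, while IAE and RE are reported per data point to show how closely the fitted model tracks each measurement.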
In the experiment, the current and voltage values of N_k = 26 measured pairs of the French RTC photovoltaic cell, obtained at an irradiance of 1000 W/m² and a temperature of 33 °C, are used as experimental data to estimate the SDM and DDM model parameters. The current and voltage values of 36 series-connected polycrystalline silicon cells at 45 °C and an irradiance of 1000 W/m² are used as experimental data for the Photowatt-PWP201 estimation [69]. Table 21 lists the relevant parameters of SDM [70], with Lb and Ub representing the lower and upper bounds. The parameters of all the remaining algorithms are the same.
Table 22 shows that FLAS performs well compared with the other MAs, significantly improving processing efficiency and achieving higher accuracy, which strengthens the competitive advantage of FLAS in the field of intelligent computing. Table 22 lists the 26 groups of measured voltage V, current I, and power P data of SDM, as well as the current Im, absolute current error IAEI, and absolute power error IAEP estimated using FLAS; all absolute current errors are less than 1.61E-03. Combined with the curves in Figure 24a,b, the current data Im and power data Pm calculated using FLAS are highly close to the actual data I and P. Figure 24c,d shows the IAE and RE of the simulated current; there is a high similarity between the experimental data and the estimated data. FLAS is thus a method that can accurately estimate the SDM parameters.
At the same time, FLAS is used to seek the SDM-related parameters. Table 23 shows the comparison results of the five parameter values and the RMSE obtained for SDM using eight other MAs, namely FLA, DMOA [69], IPSO [71], IGWO, ISSA [72], CSA [73], SCHO [74], and TSA. The RMSE value of FLAS is 1.09E-03. In summary, FLAS finds the solution faster, and its optimal solution is more accurate and stable. The results show that FLAS improves the output efficiency of the model.

7. Conclusions and Future Prospects

This paper proposes a multi-strategy augmented Fick's law optimization algorithm (FLAS) to improve performance on high-dimensional, high-complexity problems, which combines the differential mutation strategy, the Gaussian local mutation strategy, the interweaving-based comprehensive learning strategy, and the seagull update strategy. First, in the DO phase, FLAS improves search diversity by adding the differential and Gaussian local mutation strategies, which further improves the convergence efficiency and exploration capability in later iterations. In addition, the improved algorithm effectively enhances search capability and stochasticity by introducing the cross-based comprehensive learning strategy in the EO phase. Secondly, by introducing the Levy flight strategy into the position update, the Levy distribution is effectively used to generate random steps that improve the overall randomization ability in the search space. Further, influenced by the idea of the seagull optimization algorithm, FLAS introduces a migration strategy in the SSO stage to effectively avoid excessive aggregation of molecules. FLAS is compared with other excellent improved algorithms and recent search algorithms on the 23 benchmark functions and CEC2020. The results show that FLAS provides dominant results, especially when dealing with multimodal test functions; however, there is still room for improvement on unimodal functions. In addition, the proposed FLAS is applied to seven real-world engineering optimization problems, and the results show that the proposed algorithm has advantages in terms of computational power and convergence accuracy. Finally, FLAS is applied to the parameter estimation of solar PV models, and the experimental results demonstrate the applicability and potential of the proposed algorithm in engineering applications.
In future work, we will consider adding strategies at the initial population phase, such as Tent or Cubic chaos mapping, and further improve the optimization capability of the algorithm through different adaptive parameter-selection schemes or by combining it with other strategies. Performance can be validated in more detail on more diverse test sets and more challenging engineering applications. In addition, image feature selection [75,76], multi-objective problems [77], image segmentation [78,79], path planning [80,81,82], truss topology optimization [83], and shape optimization [84] can all be solved experimentally with FLAS.
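As an illustration of the chaotic initialization mentioned above, a Tent-map-based population initializer might look like the following sketch. The map parameter `a` and seed `x0` are assumed values for demonstration, not settings proposed in this paper.

```python
import numpy as np

def tent_chaos_init(pop_size, dim, lb, ub, a=0.499, x0=0.7):
    """Initialize a population from a Tent chaotic sequence instead of uniform
    random numbers, then scale it into [lb, ub]. The Tent map
        x_{k+1} = x_k / a        if x_k < a
        x_{k+1} = (1 - x_k)/(1 - a)  otherwise
    keeps iterates in [0, 1] while spreading them more evenly than some RNGs."""
    seq = np.empty(pop_size * dim)
    x = x0
    for k in range(seq.size):
        x = x / a if x < a else (1 - x) / (1 - a)
        seq[k] = x
    return lb + seq.reshape(pop_size, dim) * (ub - lb)
```

The returned array can replace the uniform-random initial population of FLAS or any other metaheuristic without changing the rest of the algorithm.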

Author Contributions

Conceptualization, G.H. and J.Z.; Methodology, J.Y., G.H. and J.Z.; Software, J.Y.; Validation, J.Y., G.H. and J.Z.; Formal analysis, G.H.; Investigation, J.Y., G.H. and J.Z.; Resources, G.H.; Data curation, J.Y. and J.Z.; Writing—original draft, J.Y., G.H. and J.Z.; Writing—review and editing, J.Y., G.H. and J.Z.; Visualization, J.Y. and G.H.; Supervision, J.Z.; Project administration, G.H. and J.Z.; Funding acquisition, G.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (grant Nos. 52375264 and 62376212).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

All data generated or analyzed during the study are included in this published article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Fu, Q. An Algorithm of Unconstrained Optimization Problem. Ph.D. Thesis, Xi’an University of Science and Technology, Xi’an, China, 2008. [Google Scholar]
  2. Hu, G.; Zheng, Y.; Abualigah, L.; Hussien, A.G. DETDO: An adaptive hybrid dandelion optimizer for engineering optimization. Adv. Eng. Inform. 2023, 57, 102004. [Google Scholar] [CrossRef]
  3. Hu, G.; Zhong, J.; Wei, G.; Chang, C.-T. DTCSMO: An efficient hybrid starling murmuration optimizer for engineering applications. Comput. Methods Appl. Mech. Eng. 2023, 405, 115878. [Google Scholar] [CrossRef]
  4. Sun, X.-P. Based on 0-1 integer programming traffic signal control optimization model and algorithm. Comput. Eng. Appl. 2008, 44, 3. [Google Scholar]
  5. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 1944, pp. 1942–1948. [Google Scholar]
  6. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
  7. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  8. Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Nutcracker optimizer: A novel nature-inspired metaheuristic algorithm for global optimization and engineering design problems. Knowl.-Based Syst. 2023, 262, 110248. [Google Scholar] [CrossRef]
  9. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  10. Hu, G.; Guo, Y.; Wei, G.; Abualigah, L. Genghis Khan shark optimizer: A novel nature-inspired algorithm for engineering optimization. Adv. Eng. Inform. 2023, 58, 102210. [Google Scholar] [CrossRef]
  11. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075. [Google Scholar] [CrossRef]
  12. Misaghi, M.; Yaghoobi, M. Improved invasive weed optimization algorithm (IWO) based on chaos theory for optimal design of PID controller. J. Comput. Des. Eng. 2019, 6, 284–295. [Google Scholar] [CrossRef]
  13. Cheraghalipour, A.; Hajiaghaei-Keshteli, M.; Paydar, M.M. Tree Growth Algorithm (TGA): A novel approach for solving optimization problems. Eng. Appl. Artif. Intell. 2018, 72, 393–414. [Google Scholar] [CrossRef]
  14. Zhang, Q.; Gao, H.; Zhan, Z.-H.; Li, J.; Zhang, H. Growth Optimizer: A powerful metaheuristic algorithm for solving continuous and discrete global optimization problems. Knowl.-Based Syst. 2023, 261, 110206. [Google Scholar] [CrossRef]
  15. Abdel-Basset, M.; El-Shahat, D.; Jameel, M.; Abouhawwash, M. Young’s double-slit experiment optimizer: A novel metaheuristic optimization algorithm for global and constraint optimization problems. Comput. Methods Appl. Mech. Eng. 2023, 403, 115652. [Google Scholar] [CrossRef]
  16. Goodarzimehr, V.; Shojaee, S.; Hamzehei-Javaran, S.; Talatahari, S. Special Relativity Search: A novel metaheuristic method based on special relativity physics. Knowl.-Based Syst. 2022, 257, 109484. [Google Scholar] [CrossRef]
  17. Zhao, W.; Wang, L.; Zhang, Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation problem. Knowl.-Based Syst. 2019, 163, 283–304. [Google Scholar] [CrossRef]
  18. Siddique, N.; Adeli, H. Nature-Inspired Chemical Reaction Optimisation Algorithms. Cogn. Comput. 2017, 9, 411–422. [Google Scholar] [CrossRef] [PubMed]
  19. Salawudeen, A.T.; Mu’azu, M.B.; Sha’aban, Y.A.; Adedokun, A.E. A Novel Smell Agent Optimization (SAO): An extensive CEC study and engineering application. Knowl.-Based Syst. 2021, 232, 107486. [Google Scholar] [CrossRef]
  20. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112–113, 283–294. [Google Scholar] [CrossRef]
  21. Hu, G.; Zhong, J.; Du, B.; Wei, G. An enhanced hybrid arithmetic optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 394, 114901. [Google Scholar] [CrossRef]
  22. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  23. Moustafa, G.; Tolba, M.A.; El-Rifaie, A.M.; Ginidi, A.; Shaheen, A.M.; Abid, S. A Subtraction-Average-Based Optimizer for Solving Engineering Problems with Applications on TCSC Allocation in Power Systems. Biomimetics 2023, 8, 332. [Google Scholar] [CrossRef] [PubMed]
  24. Satapathy, S.; Naik, A. Social group optimization (SGO): A new population evolutionary optimization technique. Complex Intell. Syst. 2016, 2, 173–203. [Google Scholar] [CrossRef]
  25. Kumar, M.; Kulkarni, A.J.; Satapathy, S.C. Socio evolution & learning optimization algorithm: A socio-inspired optimization methodology. Future Gener. Comput. Syst. 2018, 81, 252–272. [Google Scholar]
  26. Das, B.; Mukherjee, V.; Das, D. Student psychology based optimization algorithm: A new population based optimization algorithm for solving optimization problems. Adv. Eng. Softw. 2020, 146, 102804. [Google Scholar] [CrossRef]
  27. Hashim, F.A.; Mostafa, R.R.; Hussien, A.G.; Mirjalili, S.; Sallam, K.M. Fick’s Law Algorithm: A physical law-based algorithm for numerical optimization. Knowl.-Based Syst. 2023, 260, 110146. [Google Scholar] [CrossRef]
  28. Li, C.; Li, J.; Chen, H.; Heidari, A.A. Memetic Harris Hawks Optimization: Developments and perspectives on project scheduling and QoS-aware web service composition. Expert Syst. Appl. 2021, 171, 114529. [Google Scholar] [CrossRef]
  29. Liang, B.X.; Zhao, Y.L.; Li, Y. A hybrid particle swarm optimization with crisscross learning strategy. Eng. Appl. Artif. Intell. 2021, 105, 104418. [Google Scholar] [CrossRef]
  30. Hu, G.; Chen, L.; Wang, X.; Wei, G. Differential Evolution-Boosted Sine Cosine Golden Eagle Optimizer with Lévy Flight. J. Bionic Eng. 2022, 19, 1850–1885. [Google Scholar] [CrossRef]
  31. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  32. Huang, Q.; Ding, H.; Razmjooy, N. Oral cancer detection using convolutional neural network optimized by combined seagull optimization algorithm. Biomed. Signal Process. Control 2024, 87, 105546. [Google Scholar] [CrossRef]
  33. Li, X.; Chen, S. Improved Differential Evolution Algorithm for B-spline Curve and Surface Fitting Problem. Comput. Appl. Softw. 2018, 35, 275–281+298. [Google Scholar]
  34. Xu, L.; Song, B.; Cao, M. An improved particle swarm optimization algorithm with adaptive weighted delay velocity. Syst. Sci. Control Eng. 2021, 9, 188–197. [Google Scholar] [CrossRef]
  35. Abdel-Basset, M.; Mohamed, R.; Sallam, K.M.; Chakrabortty, R.K. Light Spectrum Optimizer: A Novel Physics-Inspired Metaheuristic Optimization Algorithm. Mathematics 2022, 10, 3466. [Google Scholar] [CrossRef]
  36. Abualigah, L.; Diabat, A.; Mirjalili, S.; Elaziz, M.A.; Gandomi, A.H. The Arithmetic Optimization Algorithm. Comput. Meth. Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  37. Houssein, E.H.; Abdelkareem, D.A.; Emam, M.M.; Hameed, M.A.; Younan, M. An efficient image segmentation method for skin cancer imaging using improved golden jackal optimization algorithm. Comput. Biol. Med. 2022, 149, 106075. [Google Scholar] [CrossRef]
  38. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S. An improved grey wolf optimizer for solving engineering problems. Expert Syst. Appl. 2021, 166, 113917. [Google Scholar] [CrossRef]
  39. Hussain, K.; Salleh, M.N.M.; Cheng, S.; Shi, Y. On the exploration and exploitation in popular swarm-based metaheuristic algorithms. Neural Comput. Appl. 2019, 31, 7665–7683. [Google Scholar] [CrossRef]
  40. Morales-Castañeda, B.; Zaldivar, D.; Cuevas, E.; Fausto, F.; Rodríguez, A. A better balance in metaheuristic algorithms: Does it exist? Swarm Evol. Comput. 2020, 54, 100671. [Google Scholar] [CrossRef]
  41. Jones, D.R.; Martins, J.R.R.A. The DIRECT algorithm: 25 years later. J. Glob. Optim. 2021, 79, 521–566. [Google Scholar] [CrossRef]
  42. Sergeyev, Y.D.; Kvasov, D.E.; Mukhametzhanov, M.S. On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 2018, 8, 453. [Google Scholar] [CrossRef]
  43. Sergeyev, Y.D.; Kvasov, D.E.; Mukhametzhanov, M.S. Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms. Math. Comput. Simul. 2017, 141, 96–109. [Google Scholar] [CrossRef]
  44. Jiang-Hong, D.; Bing, Y. Marine gear box multistage planetary gear transmission system design. Ship Sci. Technol. 2019, 41, 94–96. [Google Scholar]
  45. Liu, X.; Wan, K.; Jin, D.; Gui, X. Development of a Throughflow-Based Simulation Tool for Preliminary Compressor Design Considering Blade Geometry in Gas Turbine Engine. Appl. Sci. 2021, 11, 422. [Google Scholar] [CrossRef]
  46. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate swarm algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intel. 2020, 90, 103541. [Google Scholar] [CrossRef]
  47. Fauzi, H.; Batool, U. A Three-bar Truss Design using Single-solution Simulated Kalman Filter Optimizer. MEKATRONIKA 2019, 1, 98–102. [Google Scholar] [CrossRef]
  48. Hu, L.S. On Piston Rod System Operation Analysis and Piston Structure Optimization Research. Appl. Mech. Mater. 2014, 513–517, 4147–4151. [Google Scholar] [CrossRef]
  49. Zhao, Y.; Song, M. Application and Analysis of Reliability Method in Pressure Vessel Design. Chem. Eng. Des. 2002, 12, 24–26. [Google Scholar]
  50. Hu, G.; Yang, R.; Abbas, M.; Wei, G. BEESO: Multi-strategy boosted snake-inspired optimizer for engineering applications. J. Bionic Eng. 2023, 20, 1791–1827. [Google Scholar] [CrossRef]
  51. Hu, G.; Du, B.; Wang, X.; Wei, G. An enhanced black widow optimization algorithm for feature selection. Knowl.-Based Syst. 2022, 235, 107638. [Google Scholar] [CrossRef]
  52. Jia, H.; Peng, X.; Lang, C. Remora optimization algorithm. Expert Syst. Appl. 2021, 185, 115665. [Google Scholar] [CrossRef]
  53. Jia, H.; Rao, H.; Wen, C.; Mirjalili, S. Crayfish optimization algorithm. Artif. Intell. Rev. 2023, 56, 1919–1979. [Google Scholar] [CrossRef]
  54. Hu, G.; Zhu, X. An improved marine predators algorithm for shape optimization of developable Ball surfaces. Eng. Appl. Artif. Intell. 2021, 105, 104417. [Google Scholar] [CrossRef]
  55. Zhong, C.; Li, G.; Meng, Z. Beluga whale optimization: A novel nature-inspired metaheuristic algorithm. Knowl.-Based Syst. 2022, 251, 109215. [Google Scholar] [CrossRef]
  56. Vasantharaj, A.; Rani, P.S.; Huque, S.; Raghuram, K.S.; Ganeshkumar, R.; Shafi, S.N. Automated Brain Imaging Diagnosis and Classification Model using Rat Swarm Optimization with Deep Learning based Capsule Network. Int. J. Image Graph. 2021, 23, 2240001. [Google Scholar] [CrossRef]
  57. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  58. Hu, G.; Yang, R.; Wei, G. Hybrid chameleon swarm algorithm with multi-strategy: A case study of degree reduction for disk Wang-Ball curves. Math. Comput. Simul. 2023, 206, 709–769. [Google Scholar] [CrossRef]
  59. Jia, H.; Lu, C. Guided learning strategy: A novel update mechanism for metaheuristic algorithms design and improvement. Knowl.-Based Syst. 2024, 286, 111402. [Google Scholar] [CrossRef]
  60. Jia, H.; Lu, C.; Xing, Z. Memory backtracking strategy: An evolutionary updating mechanism for meta-heuristic algorithms. Swarm Evol. Comput. 2024, 84, 101456. [Google Scholar] [CrossRef]
  61. Li, S.M.; Chen, H.L.; Wang, M.J.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  62. Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  63. Hu, G.; Li, M.; Wang, X.; Wei, G.; Chang, C.T. An enhanced manta ray foraging optimization algorithm for shape optimization of complex CCG-Ball curves. Knowl.-Based Syst. 2022, 240, 108071. [Google Scholar] [CrossRef]
  64. Toledo, F.J.; Galiano, V.; Blanes, J.M.; Herranz, V.; Batzelis, E. Photovoltaic single-diode model parametrization. An application to the calculus of the Euclidean distance to an I–V curve. Math. Comput. Simul. 2023. [Google Scholar] [CrossRef]
  65. Beskirli, A.; Daǧ, İ. An efficient tree seed inspired algorithm for parameter estimation of photovoltaic models. Energy Rep. 2022, 8, 291–298. [Google Scholar] [CrossRef]
  66. Deotti, L.M.P.; da Silva, I.C. A survey on the parameter extraction problem of the photovoltaic single diode model from a current–voltage curve. Sol. Energy 2023, 263, 111930. [Google Scholar] [CrossRef]
  67. Wang, Y. Study on the Effect of Non-Uniform Temperature and Irradiation Distribution on Photoelectric Performance of Solar PV/T System. Ph.D. Thesis, University of Science and Technology of China, Hefei, China, 25 October 2023. [Google Scholar]
  68. Jiao, S.; Chong, G.; Huang, C.; Hu, H.; Wang, M.; Heidari, A.A.; Chen, H.; Zhao, X. Orthogonally adapted Harris Hawk Optimization for parameter estimation of photovoltaic models. Energy 2020, 203, 117804. [Google Scholar] [CrossRef]
  69. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf Mongoose Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  70. Sultan, H.M.; Menesy, A.S.; Kamel, S.; Korashy, A.; Almohaimeed, S.; Abdel-Akher, M. An improved artificial ecosystem optimization algorithm for optimal configuration of a hybrid PV/WT/FC energy system. Alex. Eng. J. 2021, 60, 1001–1025. [Google Scholar] [CrossRef]
  71. Nzale, W.; Ashourian, H.; Mahseredjian, J.; Gras, H. A tool for automatic determination of model parameters using particle swarm optimization. Electr. Power Syst. Res. 2023, 219, 109258. [Google Scholar] [CrossRef]
  72. Li, B.; Wang, H. Multi-objective sparrow search algorithm: A novel algorithm for solving complex multi-objective optimisation problems. Expert Syst. Appl. 2022, 210, 118414. [Google Scholar] [CrossRef]
  73. Braik, M.S. Chameleon Swarm Algorithm: A Bio-inspired Optimizer for Solving Engineering Design Problems. Expert Syst. Appl. 2021, 174, 114685. [Google Scholar] [CrossRef]
  74. Bai, J.; Li, Y.; Zheng, M.; Khatir, S.; Benaisa, B.; Abualigah, L.; Wahab, M.A. A Sinh Cosh Optimizer. Knowl.-Based Syst. 2023, 282, 111081. [Google Scholar] [CrossRef]
  75. Hu, G.; Zhong, J.; Wang, X.; Wei, G. Multi-strategy assisted chaotic coot-inspired optimization algorithm for medical feature selection: A cervical cancer behavior risk study. Comput. Biol. Med. 2022, 151, 106239. [Google Scholar] [CrossRef] [PubMed]
  76. Houssein, E.H.; Oliva, D.; Celik, E.; Emam, M.M.; Ghoniem, R.M. Boosted sooty tern optimization algorithm for global optimization and feature selection. Expert Syst. Appl. 2023, 213 B, 119015. [Google Scholar] [CrossRef]
  77. Zhao, W.; Zhang, Z.; Mirjalili, S.; Wang, L.; Khodadadi, N.; Mirjalili, S.M. An effective multi-objective artificial hummingbird algorithm with dynamic elimination-based crowding distance for solving engineering design problems. Comput. Methods Appl. Mech. Eng. 2022, 398, 115223. [Google Scholar] [CrossRef]
  78. Yu, X.; Wu, X. Ensemble grey wolf Optimizer and its application for image segmentation. Expert Syst. Appl. 2022, 209, 118267. [Google Scholar] [CrossRef]
  79. Houssein, E.H.; Hussain, K.; Abualigah, L.; Abd Elaziz, M.; Alomoush, W.; Dhiman, G.; Djenouri, Y.; Cuevas, E. An improved opposition-based marine predators algorithm for global optimization and multilevel thresholding image segmentation. Knowl.-Based Syst. 2021, 229, 107348. [Google Scholar] [CrossRef]
  80. Liu, G.; Shu, C.; Liang, Z.; Peng, B.; Cheng, L. A modified sparrow search algorithm with application in 3D route planning for UAV. Sensors 2021, 21, 1224. [Google Scholar] [CrossRef]
  81. Yu, X.; Li, C.; Zhou, J. A constrained differential evolution algorithm to solve UAV path planning in disaster scenarios. Knowl.-Based Syst. 2020, 204, 106209. [Google Scholar] [CrossRef]
  82. Hu, G.; Huang, F.; Seyyedabbasi, A.; Wei, G. Enhanced multi-strategy bottlenose dolphin optimizer for UAVs path planning. Appl. Math. Model. 2024, 130, 243–271. [Google Scholar] [CrossRef]
  83. Assimi, H.; Jamali, A. A hybrid algorithm coupling genetic programming and Nelder–Mead for topology and size optimization of trusses with static and dynamic constraints. Expert Syst. Appl. 2018, 95, 127–141. [Google Scholar] [CrossRef]
  84. Zheng, J.; Ji, X.; Ma, Z.; Hu, G. Construction of Local-Shape-Controlled Quartic Generalized Said-Ball Model. Mathematics 2023, 11, 2369. [Google Scholar] [CrossRef]
Figure 1. MAs classification figure.
Figure 2. Molecule diffusion direction.
Figure 3. Molecule diffusion direction.
Figure 4. Molecule concentration equilibrium phase.
Figure 5. Collision avoidance of molecule.
Figure 6. Algorithm flow chart of FLAS.
Figure 7. EXE diagram on 23 functions.
Figure 8. Exploration and development map on CEC2022.
Figure 9. Convergence comparison diagram of FLAS and others.
Figure 10. Box plot of FLAS and MAs on 23 benchmark functions.
Figure 11. Convergence curves of different algorithms on CEC2020 test set.
Figure 12. Box plot of different algorithms on CEC2020 test set.
Figure 13. Radar map drawn with the rankings of different algorithms on the CEC2020 test set.
Figure 14. Operational zones built using 30 runs performed using the FLAS method and operational characteristics for the DIRECT method on ten CEC2020 functions. The upper and the lower boundaries of the zone are shown as dark blue curves.
Figure 15. Convergence results for FLAS in CEC2020 test functions.
Figure 16. Reducer design problems.
Figure 17. Design problems of three-bar truss design.
Figure 18. Design problem of gear group.
Figure 19. Schematic diagram of piston rod design.
Figure 20. Design problems of gas transmission compressor.
Figure 21. Pressure vessel design problems.
Figure 22. Problem of stepped cone pulley.
Figure 23. Schematic drawing of SDM.
Figure 24. Schematic drawing of SDM.
Table 1. Initial parameter settings of all algorithms.
Algorithms | Parameter | Parameter Value
FLAS | Constants (K1, K2, K3, K4, K5, D) | K1 = 0.5; K2 = 2; K3 = 0.1; K4 = 0.2; K5 = 2; D = 0.01
PSO_ELPM | Constants (alpha, delta, u) | alpha = 0.1; delta = 0.1; u = 0.0265
HHO | Initial energy E0 | E0 = 2 * rand() − 1
DE | Control parameter Cr; variation scaling factor F | Cr = 0.4; F = 0.5
LSO | Constants (Ps, Pe, Ph, B) | Ps = 0.05; Pe = 0.6; Ph = 0.4; B = 0.05
FLA | Constants (G1, G2, G3, G4, G5, D) | G1 = 0.5; G2 = 2; G3 = 0.1; G4 = 0.2; G5 = 2; D = 0.01
BWO | Constants (alpha, KD) | alpha = 3/2; KD = 0.05
IGJO | Control parameter E1 | E1 = 1.5 * (1 − (l/Maxiteration))
AOA | Constants (B1, B2, B3, B4, u, l) | B1 = 2; B2 = 6; B3 = 1; B4 = 2; u = 0.9; l = 0.1
IGWO | Control parameter a | a = 2 − iter * (2/Maxiteration)
Table 2. Numerical results of FLAS for different values of the parameter alfa.
Functions | alfa (value, rank)
0.01 + 0.04 × rand | 0.05 + 0.04 × rand | 0.1 + 0.04 × rand | 0.15 + 0.04 × rand | 0.2 + 0.04 × rand
cec0115,077.0282612,709.1942117,920.146971015,364.27886713,661.362273
cec021574.93832251575.81815761560.86471231615.71291291561.8534614
cec03726.62637892727.68280718727.48585055726.71051883726.0414411
cec041900119001190011900119001
cec0561,152.90341919,708.71303427,162.80712664,299.90351034,461.354418
cec061828.744955101752.7596841759.33933871755.90561951746.3173333
cec0719,049.10676108395.313019312,561.99941810,579.18166714.908141
cec082303.17465362299.07283922298.03955112299.30441642357.74902210
cec092741.01261332740.51899722765.743281102737.71315112759.2241829
cec102934.14723972930.80635162935.53238182925.94217722912.6184761
Total rank | 59 | 37 | 59 | 48 | 41
Functions | alfa (value, rank)
0.25 + 0.04 × rand | 0.3 + 0.04 × rand | 0.35 + 0.04 × rand | 0.4 + 0.04 × rand | 0.45 + 0.04 × rand
cec0114,951.16004515,854.54192912,820.64997214,083.00748415,640.962528
cec021531.4517221518.09014511598.22886181582.98207371628.4001810
cec03729.773885710727.63154897729.2285099727.36921464727.60559746
cec041900119001190011900119001
cec0512,017.4934122,570.8507518,248.68068328,814.82339715,926.230032
cec061781.72730591725.54371521764.46415781758.57206161716.5177311
cec0710,188.74183511,290.3988478892.82558746925.282563213,162.080889
cec082304.09165382299.09266132353.81684492303.36119172303.0463635
cec092749.04957662745.30421142752.26489382747.98830952749.3200347
cec102936.867379102930.44795452935.92865192925.97736132927.8464784
Total rank | 57 | 44 | 61 | 46 | 53
Table 3. Experimental results of FLAS and MAs on 23 benchmark functions.
F | Index | FLAS | PSO-ELPM | HHO | DE | LSO | FLA | BWO | IGJO | AOA | IGWO
F1Best03.950E+036.085E-1222.748E+038.675E-0909.594E-2532.464E-1317.767E-1411.831E-23
Worst2.619e-3222.060E+049.563E-1035.982E+037.578E+0302.230E-2413.054E-1233.069E-1025.133E-03
Mean9.881e-3241.427E+044.099E-1044.134E+033.170E+0308.802E-2432.500E-1241.024E-1031.805E-04
Std0.000E+003.540E+031.758E-1038.371E+022.459E+03006.603E-1245.603E-1039.357E-04
Rank21059813467
F2Best03.037E+013.384E-627.800E+000.000E+0001.053E-1273.375E-713.268E-785.971E-20
Worst6.840E-1592.308E+031.225E-512.355E+011.739E-0105.219E-1222.055E-667.706E-545.510E-03
Mean2.284E-1604.568E+024.974E-531.669E+015.797E-0305.908E-1237.307E-682.574E-554.294E-04
Std1.249E-1596.695E+022.237E-523.748E+003.175E-0201.260E-1223.744E-671.407E-541.282E-03
Rank21069813457
F3Best06.289E+032.029E-1084.306E+030.000E+0009.010E-2492.210E-761.463E-1321.540E-05
Worst2.303E-2982.567E+041.061E-841.567E+049.954E+0302.576E-2327.453E-655.338E-871.932E+02
Mean7.677E-3001.509E+043.536E-868.041E+034.713E+0309.617E-2343.179E-661.779E-881.772E+01
Std05.170E+031.937E-852.476E+032.958E+0300.000E+001.362E-659.745E-884.584E+01
Rank21059813647
F4Best03.015E+011.020E-582.703E+013.305E+0006.763E-1268.298E-516.909E-648.449E-06
Worst9.548E-1597.629E+015.870E-496.019E+015.202E+017.733E-094.473E-1191.876E-463.971E-497.479E+00
Mean6.732E-1606.482E+012.668E-504.398E+013.572E+016.070E-102.517E-1201.379E-473.008E-504.429E-01
Std2.286E-1599.157E+001.101E-496.845E+001.364E+011.643E-098.493E-1203.731E-479.010E-501.415E+00
Rank11039862547
F5Best1.437E-056.542E+064.616E-053.313E+058.766E+008.635E-034.710E-035.752E+008.266E+006.671E+00
Worst7.160E-026.327E+076.772E-034.984E+063.993E+061.389E+011.008E-018.701E+008.938E+003.549E+02
Mean5.775E-033.245E+071.207E-032.464E+063.909E+059.913E-013.377E-026.786E+008.658E+003.349E+01
Std1.444E-021.445E+071.397E-031.366E+068.554E+052.877E+002.293E-026.604E-011.285E-018.166E+01
Rank21019843567
F6Best2.477E-107.672E+031.834E-071.539E+032.046E+005.829E-051.198E-046.650E-063.887E-011.723E-02
Worst2.130E-051.905E+045.891E-057.015E+038.885E+035.203E-035.261E-044.881E-011.114E+004.166E-01
Mean1.870E-061.402E+041.420E-054.016E+032.859E+031.066E-033.386E-041.156E-017.584E-018.227E-02
Std3.940E-063.378E+031.601E-051.281E+032.764E+031.206E-031.304E-041.521E-012.005E-019.068E-02
Rank11029843675
F7Best2.12E-071.40755.83E-060.354490.000128964.02E-056.63E-066.90E-066.55E-051.20E-04
Worst1.62E-049.99392.12E-041.70410.0110291.66E-033.09E-046.45E-041.06E-030.011526
Mean4.28E-055.32847.13E-051.02880.00230524.50E-048.81E-051.65E-043.83E-040.0013873
Std3.57E-052.63616.01E-050.392210.00251444.02E-046.80E-051.67E-042.60E-040.0020545
Rank11029863457
F8Best−4.190E+03−2.137E+03−4.190E+03−2.399E+03−2.064E+48−4.190E+03−4.186E+03−3.179E+03−2.314E+08−3.780E+03
Worst−3.479E+03−1.806E+03−2.648E+03−1.669E+03−8.041E+07−4.071E+03−3.451E+03−1.744E+03−4.084E+03−2.179E+03
Mean−4.099E+03−1.827E+03−4.138E+03−2.005E+03−6.881E+46−4.178E+03−4.000E+03−2.389E+03−1.785E+07−2.977E+03
Std1.695E+026.699E+012.815E+021.691E+023.768E+473.621E+012.435E+023.671E+025.104E+074.386E+02
Rank51049136827
F9Best07.469E+0104.064E+01000002.140E+00
Worst01.392E+0209.267E+014.704E+019.950E-01002.080E+014.135E+01
Mean01.142E+0207.356E+012.100E+003.320E-02003.460E+001.493E+01
Std01.379E+0101.196E+018.890E+001.820E-01007.180E+001.041E+01
Rank11029653478
F10Best4.440E-161.846E+014.440E-161.173E+014.440E-164.000E-154.440E-164.440E-164.440E-162.040E-10
Worst4.440E-161.997E+014.440E-161.805E+011.853E+017.550E-154.440E-164.000E-154.000E-155.325E+00
Mean4.440E-161.973E+014.440E-161.671E+011.130E+014.230E-154.440E-163.520E-151.630E-151.842E-01
Std03.811E-0101.312E+005.290E+009.010E-1601.230E-151.700E-159.714E-01
Rank11029863547
F11Best07.400E+0101.549E+012.124E+0000001.240E-02
Worst01.818E+0206.406E+018.576E+012.120E-01005.339E-011.011E+00
Mean01.307E+0203.862E+014.076E+014.902E-02004.912E-023.595E-01
Std02.749E+0109.445E+002.054E+015.351E-02001.148E-012.891E-01
Rank11028953467
F12Best1.190E-101.526E+071.040E-082.907E+044.620E-019.240E-068.260E-061.340E-066.810E-029.740E-04
Worst7.550E-051.437E+083.190E-051.130E+074.729E+062.530E-041.770E-047.870E-021.040E+005.965E+00
Mean5.090E-066.317E+077.580E-061.437E+065.600E+051.110E-048.630E-053.130E-022.710E-013.243E-01
Std1.380E-053.489E+078.600E-062.122E+061.400E+065.380E-054.280E-052.080E-021.890E-011.155E+00
Rank11029843567
F13Best3.220E-072.630E+072.670E-087.094E+058.820E-016.370E-053.270E-051.500E-052.360E-018.880E-03
Worst8.940E-053.183E+082.920E-043.475E+072.373E+071.230E-023.310E-043.030E-018.270E-014.617E+00
Mean1.550E-051.349E+082.950E-058.968E+062.170E+062.030E-031.330E-049.420E-024.600E-012.836E-01
Std2.280E-056.868E+075.680E-057.319E+065.490E+063.800E-037.990E-059.110E-021.490E-018.684E-01
Rank11029843576
F14Best9.980E-013.420E+009.980E-019.985E-019.980E-019.980E-019.980E-019.980E-019.980E-019.980E-01
Worst9.980E-014.169E+022.982E+009.161E+008.847E+009.980E-019.980E-011.267E+011.001E+001.040E+00
Mean9.980E-015.760E+011.230E+003.294E+003.278E+009.980E-019.980E-014.719E+009.982E-019.996E-01
Std    1.725E-10  9.906E+01  5.005E-01  2.281E+00  2.423E+00  6.710E-10  5.007E-10  4.256E+00  6.394E-04  7.675E-03
Rank   1  10  6  8  7  3  2  9  4  5
F15  Best   3.080E-04  1.012E-02  3.080E-04  1.828E-03  1.050E-03  3.150E-04  3.090E-04  3.080E-04  3.920E-04  4.270E-04
     Worst  3.990E-04  6.114E-01  1.340E-03  2.529E-02  2.603E-02  6.330E-02  5.020E-04  1.220E-03  2.090E-03  1.546E-03
     Mean   3.180E-04  9.282E-02  3.950E-04  1.141E-02  4.840E-03  3.440E-03  3.350E-04  4.620E-04  8.620E-04  7.579E-04
     Std    1.690E-05  1.052E-01  2.460E-04  6.806E-03  5.560E-03  1.190E-02  3.890E-05  2.770E-04  3.870E-04  2.237E-04
     Rank   1  10  3  9  8  7  2  4  6  5
F16  Best   −1.032E+00  −9.136E-01  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00
     Worst  −1.032E+00  4.132E+00  −1.032E+00  −9.042E-01  −1.009E+00  −1.032E+00  −1.031E+00  −1.032E+00  −1.031E+00  −1.031E+00
     Mean   −1.032E+00  1.501E-01  −1.032E+00  −1.016E+00  −1.026E+00  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00  −1.032E+00
     Std    7.278E-07  9.995E-01  1.288E-11  2.369E-02  5.881E-03  1.012E-06  8.137E-05  1.017E-07  1.230E-04  2.068E-04
     Rank   3  10  1  9  8  4  6  2  5  7
F17  Best   3.980E-01  4.110E-01  3.980E-01  3.981E-01  3.980E-01  3.980E-01  3.980E-01  3.980E-01  3.980E-01  3.980E-01
     Worst  3.980E-01  3.709E+00  3.980E-01  5.625E-01  5.304E-01  4.000E-01  4.110E-01  4.000E-01  4.840E-01  4.012E-01
     Mean   3.980E-01  1.482E+00  3.980E-01  4.542E-01  4.360E-01  3.980E-01  4.000E-01  3.980E-01  4.120E-01  3.983E-01
     Std    1.000E-05  8.000E-01  1.740E-06  5.073E-02  3.930E-02  4.620E-04  3.130E-03  3.360E-04  2.380E-02  7.667E-04
     Rank   2  10  1  9  8  4  6  3  7  5
F18  Best   3  3.155E+00  3  3.004E+00  3.000E+00  3.000E+00  3.000E+00  3.000E+00  3.000E+00  3.000E+00
     Worst  3  1.319E+02  3  9.748E+00  3.744E+00  3.000E+00  4.958E+00  3.000E+00  9.975E+00  3.030E+00
     Mean   3  4.081E+01  3  3.959E+00  3.086E+00  3.000E+00  3.396E+00  3.000E+00  3.414E+00  3.003E+00
     Std    5.161E-06  3.400E+01  2.599E-08  1.305E+00  1.498E-01  2.336E-05  5.152E-01  9.271E-07  1.331E+00  6.128E-03
     Rank   3  10  1  9  6  4  7  2  8  5
F19  Best   −3.860E+00  −3.782E+00  −3.860E+00  −3.861E+00  −3.860E+00  −3.860E+00  −3.860E+00  −3.860E+00  −3.860E+00  −3.860E+00
     Worst  −3.860E+00  −2.367E+00  −3.860E+00  −3.784E+00  −3.719E+00  −3.090E+00  −3.850E+00  −3.850E+00  −3.810E+00  −3.860E+00
     Mean   −3.860E+00  −3.393E+00  −3.860E+00  −3.841E+00  −3.820E+00  −3.840E+00  −3.860E+00  −3.860E+00  −3.850E+00  −3.862E+00
     Std    7.220E-07  3.343E-01  1.240E-03  1.901E-02  3.400E-02  1.410E-01  2.480E-03  3.560E-03  1.370E-02  7.008E-04
     Rank   1  10  3  7  9  8  5  4  6  2
F20  Best   −3.320E+00  −2.809E+00  −3.310E+00  −3.120E+00  −3.240E+00  −3.320E+00  −3.320E+00  −3.320E+00  −3.200E+00  −3.320E+00
     Worst  −3.200E+00  −9.615E-01  −2.830E+00  −2.306E+00  −2.377E+00  −3.170E+00  −3.200E+00  −3.020E+00  −2.470E+00  −3.156E+00
     Mean   −3.270E+00  −1.882E+00  −3.110E+00  −2.790E+00  −2.870E+00  −3.280E+00  −3.300E+00  −3.180E+00  −2.920E+00  −3.248E+00
     Std    6.030E-02  5.029E-01  1.300E-01  1.965E-01  1.800E-01  5.970E-02  2.380E-02  9.760E-02  1.800E-01  5.518E-02
     Rank   3  10  6  9  8  2  1  5  7  4
F21  Best   −1.015E+01  −1.421E+00  −1.013E+01  −4.805E+00  −1.015E+01  −1.012E+01  −1.015E+01  −1.015E+01  −9.951E+00  −9.835E+00
     Worst  −1.015E+01  −3.417E-01  −5.049E+00  −9.464E-01  −3.800E+00  −2.609E+00  −1.004E+01  −2.682E+00  −2.959E+00  −4.089E+00
     Mean   −1.015E+01  −5.840E-01  −5.386E+00  −2.025E+00  −7.079E+00  −6.273E+00  −1.015E+01  −8.647E+00  −6.812E+00  −6.363E+00
     Std    1.193E-03  2.493E-01  1.264E+00  9.376E-01  2.408E+00  2.592E+00  2.035E-02  2.570E+00  2.154E+00  1.860E+00
     Rank   1  10  8  9  4  7  2  3  5  6
F22  Best   −1.040E+01  −1.859E+00  −5.088E+00  −7.067E+00  −1.037E+01  −1.034E+01  −1.040E+01  −1.040E+01  −9.108E+00  −1.040E+01
     Worst  −1.040E+01  −3.932E-01  −5.081E+00  −1.151E+00  −2.554E+00  −1.818E+00  −1.034E+01  −5.108E+00  −2.510E+00  −2.494E+00
     Mean   −1.040E+01  −8.135E-01  −5.086E+00  −2.414E+00  −6.329E+00  −4.669E+00  −1.040E+01  −1.022E+01  −6.651E+00  −7.922E+00
     Std    6.842E-04  3.383E-01  1.585E-03  1.355E+00  2.577E+00  1.901E+00  1.163E-02  9.647E-01  1.875E+00  1.915E+00
     Rank   1  10  7  9  6  8  2  3  5  4
F23  Best   −1.054E+01  −1.704E+00  −1.053E+01  −4.233E+00  −1.053E+01  −1.053E+01  −1.054E+01  −1.053E+01  −1.023E+01  −1.023E+01
     Worst  −1.053E+01  −4.466E-01  −5.115E+00  −1.337E+00  −2.913E+00  −1.676E+00  −1.050E+01  −5.128E+00  −2.798E+00  −3.736E+00
     Mean   −1.054E+01  −9.062E-01  −5.307E+00  −2.211E+00  −6.205E+00  −5.302E+00  −1.053E+01  −9.990E+00  −7.092E+00  −8.115E+00
     Std    7.338E-04  3.164E-01  9.861E-01  6.607E-01  2.372E+00  2.517E+00  5.909E-03  1.640E+00  2.140E+00  1.657E+00
     Rank   1  10  7  9  6  8  2  3  5  4
Average rank  1.7  10.0  3.5  8.8  7.2  4.6  3.3  4.5  5.5  5.9
Final rank    1  10  3  9  8  5  2  4  6  7
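The Average rank row is obtained by ranking the ten algorithms on each benchmark function by mean value (1 = best for minimization) and averaging those per-function ranks; the Final rank row then orders the algorithms by their average rank. A minimal sketch of that bookkeeping (function and variable names are ours; ties are broken by position rather than averaged):

```python
def competition_ranks(means):
    """Rank algorithms on one function: 1 = smallest mean (best for minimization)."""
    order = sorted(range(len(means)), key=lambda i: means[i])
    ranks = [0] * len(means)
    for position, idx in enumerate(order, start=1):
        ranks[idx] = position
    return ranks

def average_ranks(mean_table):
    """mean_table[f][a] = mean result of algorithm a on function f."""
    n_alg = len(mean_table[0])
    totals = [0] * n_alg
    for row in mean_table:
        for a, r in enumerate(competition_ranks(row)):
            totals[a] += r
    return [t / len(mean_table) for t in totals]
```

Applied to the F23 mean row above, `competition_ranks` reproduces the printed rank row exactly.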
Table 4. Wilcoxon rank sum test of FLAS and MAs on 23 benchmark functions.
F     PSO_ELPM   HHO        DE         LSO        FLA        BWO        IGJO       AOA        IGWO
F1    2.366E-12  2.366E-12  2.366E-12  2.366E-12  1.608E-01  2.366E-12  2.366E-12  2.366E-12  2.366E-12
F2    2.800E-11  2.800E-11  2.800E-11  8.496E-06  4.788E-08  2.800E-11  2.800E-11  2.800E-11  2.800E-11
F3    1.956E-11  1.956E-11  1.956E-11  1.624E-10  2.934E-05  1.956E-11  1.956E-11  1.956E-11  1.956E-11
F4    2.800E-11  2.800E-11  2.800E-11  2.800E-11  2.904E-02  2.800E-11  2.800E-11  2.800E-11  2.800E-11
F5    3.020E-11  4.119E-01  3.020E-11  3.020E-11  1.464E-10  6.518E-09  3.020E-11  3.020E-11  3.020E-11
F6    3.020E-11  5.186E-07  3.020E-11  3.020E-11  3.020E-11  3.020E-11  9.919E-11  3.020E-11  3.020E-11
F7    3.020E-11  7.013E-02  3.020E-11  3.338E-11  4.200E-10  1.767E-03  4.084E-05  1.094E-10  3.338E-11
F8    5.219E-12  4.290E-01  3.020E-11  3.020E-11  6.097E-03  1.041E-04  3.020E-11  1.695E-09  4.504E-11
F9    1.210E-12  NaN        1.210E-12  4.190E-02  3.340E-01  NaN        NaN        1.100E-02  1.210E-12
F10   6.870E-13  NaN        1.210E-12  1.660E-11  4.160E-14  NaN        1.970E-11  6.180E-04  1.210E-12
F11   1.212E-12  NaN        1.212E-12  1.212E-12  1.306E-07  NaN        NaN        6.617E-04  1.212E-12
F12   3.020E-11  9.880E-03  3.020E-11  3.020E-11  8.150E-11  1.460E-10  3.470E-10  3.020E-11  3.020E-11
F13   3.020E-11  7.620E-01  3.020E-11  3.020E-11  3.690E-11  3.470E-10  2.440E-09  3.020E-11  3.020E-11
F14   3.020E-11  8.766E-01  3.020E-11  3.338E-11  5.264E-04  3.183E-01  2.372E-10  3.338E-11  6.696E-11
F15   3.020E-11  8.120E-04  3.020E-11  3.020E-11  1.610E-10  1.370E-03  4.840E-02  3.690E-11  3.020E-11
F16   3.020E-11  4.491E-11  3.020E-11  3.020E-11  1.580E-01  4.200E-10  6.787E-02  1.996E-05  9.919E-11
F17   3.02E-11   4.71E-04   3.02E-11   3.02E-11   2.67E-09   4.08E-11   5.27E-05   4.20E-10   1.25E-07
F18   3.02E-11   3.02E-11   3.02E-11   3.02E-11   1.00E-03   3.02E-11   1.25E-07   3.02E-11   1.33E-10
F19   3.020E-11  3.340E-11  3.020E-11  3.020E-11  4.500E-11  3.020E-11  3.340E-11  3.020E-11  3.020E-11
F20   3.020E-11  2.030E-07  3.020E-11  1.210E-10  1.220E-02  5.200E-01  2.030E-07  3.020E-11  3.500E-03
F21   3.020E-11  3.020E-11  3.020E-11  3.338E-11  3.020E-11  1.850E-08  5.494E-11  3.020E-11  3.020E-11
F22   3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11  1.311E-08  3.690E-11  3.020E-11  4.077E-11
F23   3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11  1.011E-08  4.504E-11  3.020E-11  3.020E-11
+/=/−  0/0/23  0/8/15  0/0/23  0/1/22  1/2/20  0/5/18  1/3/20  0/1/22  0/0/23
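The p-values above come from the two-sided Wilcoxon rank-sum test at the 0.05 level over the independent runs of each pair of algorithms (NaN rows presumably arise when both samples are identical, e.g., every run hits the exact optimum). A stdlib-only sketch using the large-sample normal approximation — the 30-run sample size and the helper name are our assumptions, and MATLAB's `ranksum` (which adds a continuity correction) is presumably where the exact 3.02E-11 saturation value for two non-overlapping 30-run samples comes from:

```python
import math

def ranksum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation
    (reasonable for ~30-run samples; tied values receive average ranks)."""
    pooled = sorted((v, src) for src, sample in ((0, a), (1, b)) for v in sample)
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        for k in range(i, j + 1):
            ranks[k] = (i + j) / 2.0 + 1.0  # average rank over the tie group
        i = j + 1
    r1 = sum(r for r, (_, src) in zip(ranks, pooled) if src == 0)
    n1, n2 = len(a), len(b)
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (r1 - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability
```

Two completely separated 30-run samples give a p-value on the order of 1E-11, matching the floor seen throughout the table.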
Table 5. Experimental results of FLAS and comparison algorithm on CEC2020.
F    Index  FLAS       PSO_ELPM   HHO        DE         LSO        FLA        BWO        IGJO       AOA        IGWO
F1   Best   7.466E+06  8.032E+08  8.379E+07  1.046E+11  7.643E+10  3.711E+07  9.403E+10  1.604E+10  8.590E+10  3.678E+09
     Worst  3.003E+07  3.692E+09  3.827E+08  1.682E+11  1.271E+11  6.351E+07  1.101E+11  4.308E+10  1.184E+11  3.938E+10
     Mean   1.834E+07  1.714E+09  1.347E+08  1.291E+11  1.065E+11  4.689E+07  0  3.010E+10  1.040E+11  1.814E+10
     Std    5.277E+06  6.741E+08  5.266E+07  1.481E+10  1.379E+10  7.056E+06  4.320E+09  6.361E+09  8.753E+09  8.636E+09
     Rank   1  4  3  10  9  2  7  6  8  5
F2   Best   6.186E+03  6.341E+03  7.210E+03  1.475E+04  1.399E+04  5.894E+03  1.274E+04  7.608E+03  1.288E+04  6.712E+03
     Worst  9.313E+03  1.021E+04  1.018E+04  1.733E+04  1.679E+04  9.328E+03  1.497E+04  1.540E+04  1.508E+04  1.538E+04
     Mean   7.503E+03  8.223E+03  8.590E+03  1.651E+04  1.603E+04  7.749E+03  1.423E+04  1.096E+04  1.413E+04  1.282E+04
     Std    8.692E+02  9.685E+02  9.668E+02  5.490E+02  5.961E+02  7.884E+02  4.629E+02  2.475E+03  5.424E+02  2.784E+03
     Rank   1  3  4  10  9  2  8  5  7  6
F3   Best   9.779E+02  1.453E+03  1.779E+03  3.401E+03  2.072E+03  9.909E+02  2.012E+03  1.232E+03  1.899E+03  9.895E+02
     Worst  1.166E+03  1.768E+03  2.205E+03  4.324E+03  3.361E+03  1.245E+03  2.172E+03  1.778E+03  2.198E+03  1.672E+03
     Mean   1.069E+03  1.621E+03  1.964E+03  3.982E+03  2.788E+03  1.094E+03  2.113E+03  1.514E+03  2.097E+03  1.392E+03
     Std    5.121E+01  8.501E+01  9.869E+01  2.145E+02  4.015E+02  5.291E+01  3.710E+01  1.084E+02  7.650E+01  1.316E+02
     Rank   1  5  6  10  9  2  8  4  7  3
F4   Best   1.900E+03  1.900E+03  1.900E+03  6.834E+05  1.900E+03  1.900E+03  1.900E+03  1.900E+03  1.900E+03  1.903E+03
     Worst  1.900E+03  1.900E+03  1.900E+03  2.492E+06  3.795E+05  1.913E+03  1.900E+03  1.900E+03  1.900E+03  8.196E+03
     Mean   1.900E+03  1.900E+03  1.900E+03  1.411E+06  8.422E+04  1.906E+03  1.900E+03  1.900E+03  1.900E+03  2.245E+03
     Std    0.000E+00  1.411E-11  0.000E+00  4.032E+05  1.190E+05  5.373E+00  0.000E+00  0.000E+00  0.000E+00  1.188E+03
     Rank   1  6  2  10  9  7  3  4  5  8
F5   Best   7.665E+05  2.681E+06  1.736E+06  2.247E+08  2.480E+08  9.970E+05  1.527E+08  3.831E+06  2.262E+08  7.404E+06
     Worst  1.684E+07  3.189E+07  2.482E+07  1.087E+09  8.578E+08  2.618E+07  6.294E+08  2.282E+08  7.650E+08  1.622E+08
     Mean   7.848E+06  1.558E+07  9.098E+06  4.616E+08  4.782E+08  1.156E+07  3.620E+08  3.827E+07  4.674E+08  4.060E+07
     Std    4.851E+06  9.186E+06  5.653E+06  1.993E+08  1.466E+08  6.664E+06  1.106E+08  4.251E+07  1.494E+08  3.492E+07
     Rank   1  4  2  8  10  3  7  5  9  6
F6   Best   2.789E+03  3.233E+03  3.490E+03  6.714E+03  6.417E+03  2.271E+03  6.842E+03  3.057E+03  6.554E+03  2.715E+03
     Worst  4.409E+03  5.953E+03  5.792E+03  8.293E+03  8.958E+03  4.401E+03  9.539E+03  6.044E+03  9.949E+03  5.491E+03
     Mean   3.614E+03  4.217E+03  4.474E+03  7.470E+03  8.089E+03  3.434E+03  8.148E+03  4.140E+03  8.213E+03  4.485E+03
     Std    4.681E+02  6.873E+02  5.240E+02  4.342E+02  5.428E+02  4.185E+02  6.338E+02  5.844E+02  8.839E+02  6.094E+02
     Rank   2  4  5  7  8  1  9  3  10  6
F7   Best   9.050E+05  8.392E+06  2.495E+06  6.475E+07  1.340E+08  2.995E+06  8.500E+07  6.238E+05  2.980E+08  5.972E+06
     Worst  4.054E+07  8.249E+07  5.507E+07  5.131E+08  4.631E+08  5.691E+07  4.881E+08  1.121E+08  1.142E+09  7.568E+07
     Mean   1.386E+07  3.762E+07  2.044E+07  2.517E+08  2.705E+08  1.966E+07  2.490E+08  2.819E+07  6.584E+08  2.615E+07
     Std    1.145E+07  2.067E+07  1.614E+07  1.124E+08  9.122E+07  1.555E+07  9.274E+07  2.935E+07  2.329E+08  1.752E+07
     Rank   1  6  3  8  9  2  7  5  10  4
F8   Best   7.792E+03  7.168E+03  9.578E+03  1.662E+04  1.632E+04  8.083E+03  1.528E+04  9.071E+03  1.464E+04  3.095E+03
     Worst  1.059E+04  1.369E+04  1.438E+04  1.879E+04  1.840E+04  1.065E+04  1.746E+04  1.694E+04  1.723E+04  1.682E+04
     Mean   8.887E+03  1.131E+04  1.139E+04  1.791E+04  1.754E+04  9.282E+03  1.647E+04  1.278E+04  1.615E+04  1.229E+04
     Std    6.351E+02  1.282E+03  9.061E+02  5.764E+02  5.157E+02  6.825E+02  4.319E+02  2.531E+03  6.701E+02  4.254E+03
     Rank   1  3  4  10  9  2  8  6  7  5
F9   Best   3.177E+03  3.311E+03  3.887E+03  3.612E+03  3.990E+03  3.113E+03  4.100E+03  3.264E+03  4.598E+03  3.004E+03
     Worst  3.534E+03  3.643E+03  4.793E+03  3.989E+03  4.444E+03  3.415E+03  4.662E+03  3.650E+03  5.452E+03  4.126E+03
     Mean   3.328E+03  3.509E+03  4.212E+03  3.778E+03  4.246E+03  3.279E+03  4.426E+03  3.415E+03  4.992E+03  3.390E+03
     Std    8.446E+01  9.123E+01  2.386E+02  7.846E+01  9.993E+01  6.390E+01  1.175E+02  9.127E+01  2.591E+02  2.156E+02
     Rank   2  5  7  6  8  1  9  4  10  3
F10  Best   3.030E+03  3.213E+03  3.138E+03  1.578E+04  1.287E+04  3.030E+03  1.252E+04  4.187E+03  1.189E+04  3.747E+03
     Worst  3.216E+03  3.797E+03  3.307E+03  3.184E+04  2.565E+04  3.154E+03  1.463E+04  7.903E+03  1.635E+04  5.787E+03
     Mean   3.090E+03  3.469E+03  3.219E+03  2.487E+04  1.839E+04  3.095E+03  1.371E+04  5.577E+03  1.414E+04  4.383E+03
     Std    4.151E+01  1.197E+02  4.649E+01  4.421E+03  2.786E+03  3.541E+01  5.597E+02  8.881E+02  1.293E+03  4.824E+02
     Rank   1  4  3  10  9  2  7  6  8  5
Average rank  1.2  4.4  3.9  8.9  8.9  2.4  7.3  4.8  8.1  5.1
Final rank    1  4  3  10  9  2  7  5  8  6
Table 6. Wilcoxon rank sum test of 10 functions on CEC2020 using FLAS and other MAs.
F     PSO_ELPM   HHO        DE         LSO        FLA        BWO        IGJO       AOA        IGWO
F1    3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11  3.020E-11
F2    6.972E-03  2.839E-04  3.020E-11  3.020E-11  2.282E-01  3.020E-11  1.287E-09  3.020E-11  1.429E-08
F3    3.020E-11  3.020E-11  3.020E-11  3.020E-11  9.334E-02  3.020E-11  3.020E-11  3.020E-11  4.616E-10
F4    3.337E-01  NaN        1.212E-12  1.212E-12  2.213E-06  NaN        NaN        NaN        1.212E-12
F5    9.521E-04  5.395E-01  3.020E-11  3.020E-11  3.917E-02  3.020E-11  7.043E-07  3.020E-11  8.101E-10
F6    1.236E-03  1.157E-07  3.020E-11  3.020E-11  1.297E-01  3.020E-11  5.561E-04  3.020E-11  3.805E-07
F7    2.317E-06  1.055E-01  3.020E-11  3.020E-11  1.087E-01  3.020E-11  7.483E-02  3.020E-11  1.680E-03
F8    3.197E-09  6.696E-11  3.020E-11  3.020E-11  2.608E-02  3.020E-11  1.957E-10  3.020E-11  1.175E-04
F9    1.698E-08  3.020E-11  3.020E-11  3.020E-11  4.060E-02  3.020E-11  2.254E-04  3.020E-11  9.626E-02
F10   3.338E-11  1.613E-10  3.020E-11  3.020E-11  4.290E-01  3.020E-11  3.020E-11  3.020E-11  3.020E-11
+/=/−  1/4/5  0/3/7  0/0/10  0/0/10  0/5/5  0/1/9  0/3/7  0/1/9  0/3/7
Table 7. Optimal results for reducer design problems.
Algorithms  xa1     xa2     xa3      xa4     xa5     xa6     xa7     Optimal Weight
FLAS        3.5005  0.7000  17.0000  7.3000  7.8000  3.3512  5.2869  2997.0
PSO_ELPM    3.6000  0.7000  17.0000  8.3000  8.3000  3.4051  5.5000  3211.6
HHO         3.5202  0.7000  17.0000  7.8000  7.9000  3.3513  5.2913  3015.0
DE          3.5358  0.7000  17.0000  7.6000  8.0000  3.3515  5.3040  3029.4
FLA         3.5209  0.7000  17.0000  7.3000  8.1000  3.3635  5.2908  3082.1
BWO         2.8055  0.7000  17.0000  7.3000  7.8000  3.0775  5.0000  3017.7
RSO         3.5850  0.7000  17.0000  7.3000  8.3000  3.4128  5.5000  8.7600E+07
RSA         3.5076  0.7000  17.0000  7.3000  7.8000  3.3530  5.3106  3199.4
TSA         3.6000  0.7000  17.0000  7.5000  8.1000  3.3719  5.2884  3015.3
Table 8. Statistical results of reducer design problems.
Algorithms  Mean      Std       Best      Worst
FLAS        3.00E+03  5.7526    2997.0    3.03E+03
PSO_ELPM    9.63E+06  3.49E+06  3211.6    1.30E+07
HHO         5.13E+03  1.03E+03  3015.0    5.74E+03
DE          7.03E+05  3.83E+06  3029.4    2.10E+07
FLA         4.78E+06  4.53E+06  3082.1    1.11E+07
BWO         3.08E+03  31.694    3017.7    3.17E+03
RSO         9.80E+07  3.01E+06  8.76E+07  1.00E+08
RSA         3.28E+03  43.5377   3199.4    3.35E+03
TSA         3.04E+03  12.3493   3015.3    3.06E+03
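The weight being minimized here corresponds to the standard speed reducer benchmark objective; assuming that formulation (the eleven constraints are omitted), the printed FLAS variables reproduce the printed weight:

```python
def reducer_weight(x1, x2, x3, x4, x5, x6, x7):
    """Speed reducer weight, standard benchmark formulation
    (x1: face width, x2: tooth module, x3: number of pinion teeth,
     x4/x5: shaft lengths, x6/x7: shaft diameters)."""
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))
```

With the FLAS row of Table 7 this evaluates to approximately 2997.0, matching the tabulated optimal weight.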
Table 9. Optimal results for TBT.
Algorithms  x1      x2      Objective Function Value
FLAS        0.7861  0.4068  263.46355
PSO_ELPM    0.7868  0.4050  263.46389
HHO         0.7861  0.4069  263.46343
DE          0.7854  0.4094  263.4650
SCA         0.7871  0.4048  263.46364
FLA         0.7858  0.4081  263.46676
BWO         0.7575  0.4957  263.46431
RSO         0.7858  0.4240  264.15644
SMA         0.7861  0.4070  264.66549
Table 10. Statistical results of TBT.
Algorithm\Index  Mean      Std     Best       Worst
FLAS             263.3765  0.3850  263.46355  264.8725
PSO_ELPM         263.5856  0.1413  263.46389  264.0082
HHO              263.4989  0.6240  263.46343  263.7109
DE               263.5488  0.2119  263.4650   264.6178
SCA              264.7806  4.8423  263.46364  282.5938
FLA              265.2543  2.8059  263.46676  275.4833
BWO              263.7076  0.1671  263.46431  264.0034
RSO              270.5607  5.9360  264.15644  284.3287
SMA              268.8324  1.9769  264.66549  271.1422
Table 11. Design results of gear group.
Algorithm  x1  x2  x3  x4  Fitness Value
FLAS       19  17  50  44  2.70E-12
PSO_ELPM   19  16  50  44  2.70E-12
HHO        17  20  49  43  2.70E-12
DE         18  15  35  52  2.36E-09
SCA        31  13  54  52  2.70E-12
FLA        16  20  49  44  2.31E-11
BWO        12  13  28  38  2.70E-12
RSO        22  16  37  59  1.83E-08
SMA        19  17  43  50  3.07E-10
IGWO       19  17  50  44  2.70E-12
Table 12. Statistical results of design problems of gear group.
Algorithm\Index  Mean      Std       Best      Worst
FLAS             4.68E-10  8.22E-10  2.70E-12  3.30E-09
PSO_ELPM         7.42E-09  9.44E-09  2.70E-12  2.73E-08
HHO              2.29E-09  3.48E-09  2.70E-12  1.83E-08
DE               1.26E-07  2.25E-07  2.36E-09  9.98E-07
SCA              1.85E-09  1.60E-09  2.70E-12  6.51E-09
FLA              2.38E-09  2.64E-09  2.31E-11  1.31E-08
BWO              6.27E-09  8.07E-09  2.70E-12  2.73E-08
RSO              2.90E-03  5.50E-03  1.83E-08  2.42E-02
SMA              8.04E-09  8.90E-09  3.07E-10  2.73E-08
IGWO             4.31E-09  8.74E-09  2.70E-12  2.73E-08
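The fitness in Tables 11 and 12 is the classic gear train ratio-error objective, f = (1/6.931 − (Tb·Td)/(Ta·Tf))², whose best known value in the literature is 2.7009E-12 — the 2.70E-12 entries above. A sketch (the mapping of x1…x4 onto the four gears is our assumption):

```python
def gear_train_error(ta, tb, td, tf):
    """Squared deviation of the realized gear ratio (tb*td)/(ta*tf)
    from the target ratio 1/6.931; gear tooth counts are integers."""
    return (1.0 / 6.931 - (tb * td) / (ta * tf)) ** 2
```

The well-known optimal tooth counts (49, 19, 16, 43) drive the error down to the 1E-12 scale.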
Table 13. Optimization results of piston rod design.
Algorithm  x1      x2      x3      x4        Minimum Cost
FLAS       0.0500  0.7943  1.5876  500.0000  8.3410
PSO_ELPM   0.0500  0.7936  1.5854  500.0000  8.3630
HHO        0.0500  1.0891  2.1742  264.8728  9.3236
DE         0.0500  2.8864  2.1166  417.2231  7.9374
SCA        0.0530  0.7986  1.5862  500.0000  8.5800
FLA        0.0500  0.7986  1.5969  500.0000  8.5290
BWO        0.0500  0.8246  1.5942  494.8171  9.0300
RSO        0.0500  1.7789  2.1002  285.0632  30.076
SMA        0.0500  0.7923  1.5846  500.0000  8.6250
IGWO       0.0509  0.7937  1.5863  499.1632  8.3710
Table 14. Statistical results of piston rod design.
Algorithm  Mean      Std       Best     Worst
FLAS       0.8578    0.0163    8.341    0.8965
PSO_ELPM   6.7946    32.5143   8.3630   178.9467
HHO        260.4833  157.1277  9.3236   622.5394
DE         325.4745  348.3112  7.9374   1.47E+03
SCA        0.9368    0.05240   8.5800   1.0222
FLA        285.8016  338.5126  8.5290   1.26E+03
BWO        1.27E+00  0.4354    9.0300   2.71E+00
RSO        117.9851  200.9380  30.0760  1.02E+03
SMA        16.6741   48.3553   8.6250   159.3020
IGWO       31.0201   74.5183   8.3710   301.0950
Table 15. Results of design problems of gas transmission compressor.
Algorithms  x1       x2      x3       Optimal Total Cost
FLAS        53.4045  1.1898  24.7268  2,964,375.810043
PSO_ELPM    53.3761  1.1902  24.7375  2,964,378.339311
HHO         53.4352  1.1901  24.7191  2,964,375.922331
DE          53.6247  1.1905  24.6288  2,964,390.741786
SCA         54.3003  1.1926  24.927   2,964,439.591532
FLA         53.7641  1.1914  24.6519  2,964,386.475420
BWO         55.0000  1.1978  24.5709  2,964,545.887990
RSO         41.5644  1.1470  22.2803  2,977,866.495387
SMA         53.4466  1.1901  24.7186  2,964,415.619347
IGWO        51.1324  1.1340  37.1395  2,964,380.498417
Table 16. Statistical results of design problems of gas transmission compressors.
Algorithm\Index  Mean      Std       Best              Worst
FLAS             2.96E+06  25.4933   2,964,375.810043  2.96450E+06
PSO_ELPM         2.97E+06  3.83E+03  2,964,378.339311  2.978031E+06
HHO              2.96E+06  16.2642   2,964,375.922331  2.96453E+06
DE               2.96E+06  177.0009  2,964,390.741786  2.96535E+06
SCA              2.96E+06  340.4450  2,964,439.591532  2.96563E+06
FLA              2.96E+06  171.8834  2,964,386.475420  2.96534E+06
BWO              2.97E+06  7.17E+03  2,964,545.887990  3.00365E+06
RSO              3.05E+06  6.57E+04  2,977,866.495387  3.27322E+06
SMA              2.96E+06  5.94E-05  2,964,415.619347  2.96437E+06
IGWO             2.96E+06  138.0004  2,964,380.498417  2.96494E+06
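The costs in Tables 15 and 16 are consistent with the classic gas transmission compressor design objective from the literature; assuming that formulation, the printed FLAS variables reproduce the tabulated cost to within the rounding of the printed digits:

```python
def gtc_cost(x1, x2, x3):
    """Gas transmission compressor design cost, standard benchmark form
    (x1: pipeline length factor, x2: compression ratio, x3: diameter factor)."""
    return (8.61e5 * x1**0.5 * x2 * x3**(-2.0 / 3.0) * (x2**2 - 1.0)**-0.5
            + 3.69e4 * x3
            + 7.72e8 * x2**0.219 / x1
            - 765.43e6 / x1)
```

Evaluating at the FLAS solution (53.4045, 1.1898, 24.7268) gives a cost near 2.9644E+06, in agreement with the table.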
Table 17. PVD results.
Algorithms  e1      e2      e3       e4        Fitness Value
FLAS        0.7873  0.3921  40.9869  191.0979  5925.7
PSO_ELPM    0.7984  0.4025  42.6198  170.4482  6075.4
HHO         0.8590  0.4270  44.8368  145.4212  6079.1
DE          0.9593  0.4253  49.2750  111.9953  6805.2
SCA         1.2994  0.2543  65.2013  10.4653   9012.2
FLA         0.8053  0.4019  40.9875  193.4983  6100.4
BWO         0.7553  0.4099  40.6985  197.0376  6095.6
RSO         0.7241  0.3423  53.2952  81.3113   10,771.0
SMA         0.7782  0.3847  40.3196  199.9997  5985.3
IGWO        0.8010  0.3979  41.5499  183.6757  5937.4
Table 18. PVD statistics.
Algorithms  Mean      Std          Best      Worst
FLAS        6846.7    575.8469     5925.7    7519.7
PSO_ELPM    6883.7    528.1608     6075.4    7788.2
HHO         6702.3    464.1332     6079.1    7507.1
DE          9993.3    1622.3000    6805.2    15,169.0
SCA         11,044.0  1094.5000    9012.2    11,697.0
FLA         19,821.0  32,279.0000  6100.4    170,480.0
BWO         6622.8    309.9841     6095.6    7281.4
RSO         53,919.0  82,523.0000  10,771.0  375,210.0
SMA         6021.0    361.1613     5985.3    7317.8
IGWO        7036.9    1037.4000    5937.4    9440.7
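The PVD fitness is conventionally the welded pressure vessel cost below (shell thickness e1, head thickness e2, inner radius e3, shell length e4). We give the standard literature form as a sketch; the fitness values printed above may additionally reflect the authors' constraint handling:

```python
def pvd_cost(ts, th, r, l):
    """Pressure vessel cost: shell/head material, forming, and welding terms
    in the standard benchmark formulation."""
    return (0.6224 * ts * r * l
            + 1.7781 * th * r**2
            + 3.1661 * ts**2 * l
            + 19.84 * ts**2 * r)
```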
Table 19. Stepped cone pulley problem.
Algorithm  x1        x2        x3       x4       x5       Minimum Weight
FLAS       38.38439  52.85891  70.5076  84.5168  90.0000  16.5296
PSO_ELPM   20.5373   28.2978   50.9843  84.5432  90.0000  16.8305
HHO        20.7474   28.5272   50.8087  84.5149  89.9795  16.8268
DE         18.1514   28.2883   50.8292  84.9414  90.0000  16.9537
SCA        18.9110   28.6215   51.6447  86.1582  90.0000  17.1977
FLA        39.0286   29.5965   20.1708  49.0977  0.0540   17.0044
BWO        20.3360   28.7842   51.0233  84.8595  90.0000  16.9353
RSO        45.5075   55.7320   53.7411  75.4002  88.2041  17.1085
SMA        20.5423   28.2576   50.7969  84.4959  89.9998  16.8002
IGWO       2.9386    13.1223   32.9869  89.7359  88.1759  16.9241
Table 20. Statistical results of the stepped cone pulley problem.
Algorithm  Mean      Std       Best     Worst
FLAS       19.3022   2.4226    16.5296  26.9317
PSO_ELPM   10.1245   0.4252    16.8305  11.5840
HHO        9.9768    0.1607    16.8268  10.5558
DE         10.2580   0.1641    16.9537  10.6296
SCA        10.9584   0.3018    17.1977  11.6602
FLA        0.2474    0.2432    17.0044  1.0666
BWO        10.1119   0.1172    16.9353  10.3983
RSO        2.57E+03  1.35E+03  17.1085  6.17E+03
SMA        9.8007    4.57E-04  16.8002  9.8026
IGWO       10.4245   0.2178    16.9241  10.8051
Table 21. Comparison and analysis of SDM results.
Parameter  Lb  Ub
Iph        0   1
Isd        0   1
Rs         0   0.5
Rsh        0   100
n          1   2
Table 22. Experimental results of FLAS algorithm on SDM.
No.  I        V        P           Im         IAEI      IAEP      (I, V, P: actual data; Im, IAEI, IAEP: algorithmic estimation data)
1    0.7640   −0.2057  −1.572E-01  7.64E-01   1.51E-04  3.10E-05
2    0.7620   −0.1291  −9.837E-02  7.62E-01   4.91E-04  6.34E-05
3    0.7605   −0.0588  −4.472E-02  7.61E-01   7.44E-04  4.38E-05
4    0.7605   0.0057   4.335E-03   7.60E-01   4.00E-04  2.28E-06
5    0.7600   0.0646   4.910E-02   7.59E-01   9.47E-04  6.12E-05
6    0.7590   0.1185   8.994E-02   7.58E-01   9.14E-04  1.08E-04
7    0.7570   0.1678   1.270E-01   7.57E-01   1.76E-04  2.95E-05
8    0.7570   0.2132   1.614E-01   7.56E-01   7.40E-04  1.58E-04
9    0.7555   0.2545   1.923E-01   7.55E-01   2.70E-04  6.88E-05
10   0.7540   0.2924   2.205E-01   7.54E-01   1.85E-04  5.41E-05
11   0.7505   0.3269   2.453E-01   7.52E-01   1.02E-03  3.35E-04
12   0.7465   0.3585   2.676E-01   7.47E-01   9.38E-04  3.36E-04
13   0.7385   0.3873   2.860E-01   7.40E-01   1.61E-03  6.22E-04
14   0.7280   0.4137   3.012E-01   7.27E-01   7.01E-04  2.90E-04
15   0.7065   0.4373   3.090E-01   7.07E-01   2.47E-04  1.08E-04
16   0.6755   0.4590   3.101E-01   6.75E-01   4.87E-04  2.23E-04
17   0.6320   0.4784   3.023E-01   6.31E-01   1.40E-03  6.71E-04
18   0.5730   0.4960   2.842E-01   5.72E-01   1.13E-03  5.61E-04
19   0.4990   0.5119   2.554E-01   4.99E-01   4.17E-04  2.14E-04
20   0.4130   0.5265   2.174E-01   4.14E-01   5.80E-04  3.05E-04
21   0.3165   0.5398   1.708E-01   3.17E-01   9.40E-04  5.07E-04
22   0.2120   0.5521   1.170E-01   2.12E-01   3.90E-04  2.15E-04
23   0.1035   0.5633   5.830E-02   1.03E-01   5.20E-04  2.93E-04
24   −0.0100  0.5736   −5.736E-03  −9.12E-03  8.82E-04  5.06E-04
25   −0.1230  0.5833   −7.175E-02  −1.24E-01  1.48E-03  8.64E-04
26   −0.2100  0.5900   −1.239E-01  −2.10E-01  4.75E-04  2.80E-04
Table 23. Comparison of different algorithms on SDM.
Algorithm  FLAS       FLA        DMOA       IGWO       ISSA       CSA        SCHO       BWO        TSA
Iph        0.760689   0.760385   0.760556   0.750763   0.760528   0.760608   0.760424   0.762818   0.762663
Io         0.505267   0.323282   0.437245   0.739541   0.239303   0.310693   0.540756   0.454039   0.405659
Rsh        66.009368  67.473579  60.407287  58.065965  52.288364  52.889838  73.901078  38.757679  67.221903
Rs         0.034357   0.037441   0.034849   0.030098   0.037780   0.031066   0.033881   0.037427   0.036881
n          1.527676   1.481111   1.512398   1.571048   1.451453   1.604563   1.534808   1.518459   1.504539
RMSE       1.096E-03  1.396E-03  1.871E-03  1.625E-03  3.852E-03  2.051E-03  1.242E-03  5.030E-03  2.455E-03
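Table 22's estimated currents and Table 23's RMSE both derive from the single diode model, I = Iph − Io[exp((V + I·Rs)/(n·Vt)) − 1] − (V + I·Rs)/Rsh, which is implicit in I. Below is a sketch that solves it by Newton iteration using FLAS's parameters from Table 23; note our assumptions that Io is expressed in microamps and that the thermal voltage Vt = kT/q is evaluated at 33 °C, the temperature conventionally associated with this 26-point dataset:

```python
import math

def sdm_current(v, iph=0.760689, io=0.505267e-6, rs=0.034357,
                rsh=66.009368, n=1.527676, vt=0.026383):
    """Solve the implicit single diode equation for the output current I."""
    i = iph  # the photocurrent is a good starting point near short circuit
    for _ in range(50):
        e = math.exp((v + i * rs) / (n * vt))
        g = iph - io * (e - 1.0) - (v + i * rs) / rsh - i   # residual
        dg = -io * e * rs / (n * vt) - rs / rsh - 1.0        # dg/di
        step = g / dg
        i -= step
        if abs(step) < 1e-12:
            break
    return i
```

The RMSE reported in Table 23 is then sqrt(mean((I_measured − sdm_current(V))²)) over the 26 (V, I) pairs of Table 22.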
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Yan, J.; Hu, G.; Zhang, J. Multi-Strategy Boosted Fick’s Law Algorithm for Engineering Optimization Problems and Parameter Estimation. Biomimetics 2024, 9, 205. https://doi.org/10.3390/biomimetics9040205