Article

Niching Grey Wolf Optimizer for Multimodal Optimization Problems

Rasel Ahmed, Amril Nazir, Shuhaimi Mahadzir, Mohammad Shorfuzzaman and Jahedul Islam
1 Department of Chemical Engineering, Universiti Teknologi PETRONAS, Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia
2 Department of Information Systems, College of Technological Innovation, Abu Dhabi Campus, Zayed University, Abu Dhabi P.O. Box 144534, United Arab Emirates
3 Centre for Process Systems Engineering, Institute of Autonomous System, Universiti Teknologi PETRONAS, Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia
4 Department of Computer Science, College of Computers and Information Technology, Taif University, Taif 21944, Saudi Arabia
5 Department of Fundamental and Applied Sciences, Universiti Teknologi PETRONAS, Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(11), 4795; https://doi.org/10.3390/app11114795
Submission received: 8 April 2021 / Revised: 2 May 2021 / Accepted: 4 May 2021 / Published: 24 May 2021
(This article belongs to the Section Computing and Artificial Intelligence)

Abstract

Metaheuristic algorithms are widely used for optimization in both research and industry because of their simplicity, flexibility, and robustness. However, multi-modal optimization is a difficult task, even for metaheuristic algorithms. Two important issues that must be handled when solving multi-modal problems are (a) identifying multiple local/global optima and (b) retaining these optima until the end of the search. In addition, a strong local search ability is a prerequisite for reaching the exact global optima. The Grey Wolf Optimizer (GWO) is a recently developed nature-inspired metaheuristic algorithm that requires little parameter tuning. However, the GWO suffers from premature convergence and fails to maintain the balance between exploration and exploitation when solving multi-modal problems. This study proposes a niching GWO (NGWO) that incorporates the personal-best feature of PSO and a local search technique to address these issues. The proposed algorithm was tested on 23 benchmark functions and three engineering cases. Compared to state-of-the-art metaheuristics such as PSO, GSA, GWO, Jaya, two improved variants of GWO, and a niching CSA, the NGWO outperformed all other considered algorithms on most of the test functions. Statistical analysis and Friedman tests were conducted to compare the performance of these algorithms thoroughly.

1. Introduction

The field of nature-inspired metaheuristic algorithms is continuously evolving with newly developed algorithms. These algorithms are well known for their broad local and global search ability, local optima avoidance, and quick convergence. In addition, they do not need any knowledge of a function's gradient or differentiability [1,2]. Because metaheuristic algorithms have these superior properties together with easy applicability and few parameter requirements, many variants have been developed [3]. The most widely used algorithms are Simulated Annealing (SA) [4], the Genetic Algorithm (GA) [5], Particle Swarm Optimization (PSO) [6], Differential Evolution (DE) [7], and Ant Colony Optimization (ACO) [8]. A few recent metaheuristics that have attracted the attention of researchers are the Bee Collecting Pollen Algorithm (BCPA) [9], the Black Hole (BH) Algorithm [10], the Artificial Bee Colony Algorithm (ABC) [11], Central Force Optimization (CFO) [10], the Electro-search Algorithm (ESA) [12], the Gravitational Search Algorithm (GSA) [13], the Teaching Learning Based Algorithm (TLBO) [14], the Artificial Chemical Reaction Optimization Algorithm (ACROA) [15], the Firefly Algorithm (FA) [16], Big Bang–Big Crunch (BBBC) [17], the Grey Wolf Optimizer (GWO) [18], the Dragonfly Algorithm [19], the Harris Hawk Optimizer (HHO) [20], and the Jaya Algorithm [21]. Among these recent metaheuristics, the GSA, GWO, and Jaya Algorithm are well known for their simplicity and effective exploration and exploitation.
Among the algorithms mentioned above, the GWO is a novel metaheuristic that falls under swarm intelligence [18]. This algorithm mimics the social hierarchy of wolves and their collective hunting behavior. Due to its simplicity, fast search speed, good search precision, and few controlling parameters, the GWO has been successfully applied to various technical fields such as feature selection [22,23,24], wind speed forecasting [25], optimal power flow [26,27], pattern recognition [28], unit commitment [29,30], parameter estimation [31,32], and wellbore trajectory optimization [31]. In addition to these applications, the GWO and its variants have also performed well on combinatorial optimization problems and have been successfully applied to different fields, such as the vehicle routing problem [32], job scheduling problems [33,34,35], and economic dispatch [36,37]. Some modifications of the GWO algorithm and their applications in various optimization fields are described next.
A quasi-oppositional-based learning (Q-OBL) GWO called QOGWO was proposed by Guha et al. [38] for load frequency control of power systems. The authors used QOGWO to design optimal PID controllers for two-area and four-area hydrothermal power plants and showed that their method achieved performance comparable to fuzzy logic, ANN, and ANFIS systems. In different research, a crossover- and mutation-based hybrid GWO was proposed by Jayabarathi et al. [36] to optimize non-linear, non-convex, constrained, and unconstrained economic dispatch problems under different conditions; the authors reported promising performance. Korayem et al. hybridized the GWO with the K-means clustering technique to solve the capacitated vehicle routing problem (CVRP) and named it K-GWO [32]. They further integrated a capacity constraint into their algorithm and proposed two clustering techniques. Following a cluster-first, route-second approach, they tested their technique on 20 benchmarks, and the obtained solutions were within a mean deviation of 1.76% of the known optimum solutions. Singh and Singh [39] proposed a hybridization of GWO with SCA in 2017; the hybrid GWOSCA performs well on unimodal functions but fails to do better on multi-modal problems [39]. Jiang and Zhang applied a hybrid Grey Wolf Optimizer to job shop and flexible job shop scheduling problems. Their hybrid algorithm incorporates a crossover operator to handle discrete variables, a mutation operator to avoid local optima, and a neighborhood search to increase exploration; it outperforms the PAGA, LSGA, TLBO, EDA, and BFO algorithms on both case studies [40]. Long et al. [41] recommended an exploration-enhanced GWO (EEGWO), in which they introduced a new position update formula that selects a random wolf to guide the search and controls the GWO parameter "a" non-linearly. Even with these modifications, their algorithm still suffers from premature convergence and local optima. Dhargupta et al. [42] designed a selective opposition-based learning GWO (SOGWO) to improve GWO's convergence ability. To govern the omega wolves, they used Spearman's correlation coefficient and applied opposition to a few dimensions of the wolves, which enhanced the algorithm's exploration ability. A recent variant of the GWO was developed by Nadimi-Shahraki et al. [43], who suggested a new movement strategy inspired by the individual hunting behavior of wolves, called dimension learning-based hunting (DLH) search. In DLH, each wolf builds its own neighborhood, and neighbors share information among themselves.
The literature above indicates that, owing to easy implementation and little parameter tuning, GWO variants have been applied across many optimization fields and have shown good empirical performance. However, premature convergence and the inability to balance exploitation with exploration are still the main drawbacks of the GWO and its variants when solving complex multi-modal optimization problems [42,43]. To overcome these limitations, we improve the GWO algorithm by adopting a niching technique. Niching techniques are well known for tracking multiple optimal solutions simultaneously: they increase population diversity, reduce the likelihood of being trapped in local solutions, and can locate numerous local/global optima within a single run. A few standard niching techniques developed in recent years are clustering [44], fitness sharing [44], crowding [45], and clearing [46]. Nevertheless, niching methods need more function evaluations to reach the global optima. The literature shows that various niching methods have been applied to different metaheuristic algorithms such as GA [44,45,46], PSO [2,47,48,49], and CSA [50], and niching has significantly improved the performance of these optimizers. Inspired by these results, we incorporate a fitness Euclidean-distance ratio (FER)-based niching technique, a modified velocity update, and a local search into the GWO algorithm, which significantly improves the basic GWO's accuracy. Our proposed method also needs no tuning parameters, which is a significant advantage. The key contributions of this research are to (a) integrate the FER niching technique into the GWO algorithm, (b) modify the velocity update equation, and (c) apply a local search technique at the end of the search process. To the best of our knowledge, this is the first study to incorporate a niching technique into the GWO algorithm. Since multi-modal function optimization also needs to locate the desired global optima precisely, we further incorporate the personal-best feature of PSO to retain good solutions. The remainder of the article is organized as follows: Section 2 describes the NGWO algorithm, Section 3 illustrates the results and engineering applications of the NGWO, and Section 4 presents the conclusions.

2. Materials and Methods

2.1. Grey Wolf Optimizer

Mirjalili et al. [18] proposed GWO, a novel population-based algorithm that falls under the swarm intelligence category. The key feature of GWO that makes it less prone to local solutions, and better than many other algorithms, is that it searches for the best solutions by following three leader wolves (alpha, beta, and delta), whereas most other algorithms follow only one best solution. The common wolves are called omega, and they always follow the three leaders. The major steps of GWO are:
  • Tracking, chasing, and approaching the prey;
  • Pursuing, encircling, and harassing the prey;
  • Attacking the prey.
The encircling mechanism can be mathematically represented as:
$$\vec{X}(k+1) = \vec{X}_P(k) - \vec{A} \cdot \left| \vec{C} \cdot \vec{X}_P(k) - \vec{X}(k) \right| \tag{1}$$
Here, the vector $\vec{X}$ indicates the position of a wolf and $\vec{X}_P$ represents the optimal solution's position. $\vec{C}$ and $\vec{A}$ are coefficient vectors, $\vec{C} = 2\vec{r}_2$ and $\vec{A} = 2a\,\vec{r}_1 - a$, where $\vec{r}_1$ and $\vec{r}_2$ are random vectors generated within [0, 1], and $a$ declines linearly from 2 to 0 over the iterations. All the wolves update their positions by following these equations:
$$\vec{X}_a = \vec{X}_\alpha - \vec{A}_1 \cdot \left| \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \right| \tag{2}$$
$$\vec{X}_b = \vec{X}_\beta - \vec{A}_2 \cdot \left| \vec{C}_2 \cdot \vec{X}_\beta - \vec{X} \right| \tag{3}$$
$$\vec{X}_c = \vec{X}_\delta - \vec{A}_3 \cdot \left| \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \right| \tag{4}$$
$$\vec{X}(k+1) = \frac{\vec{X}_a + \vec{X}_b + \vec{X}_c}{3} \tag{5}$$
At each iteration, the wolves update their positions according to Equations (1)–(5). The parameter a controls the movement of the wolves: during the first half of the iteration period, where a > 1, it intensifies the exploration ability of the algorithm, while in the second half, where a < 1, it strengthens the wolves' exploitation ability.
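To make the update concrete, the following minimal Python sketch implements Equations (1)–(5) for a whole population at once. The NumPy vectorization and variable names are illustrative assumptions, not the authors' original MATLAB code.

```python
import numpy as np

def gwo_step(X, X_alpha, X_beta, X_delta, a):
    """One GWO position update (Equations (1)-(5)).
    X: (n, d) population; X_alpha/X_beta/X_delta: (d,) leader positions;
    a: scalar decreasing linearly from 2 to 0 over the iterations."""
    n, d = X.shape
    candidates = []
    for leader in (X_alpha, X_beta, X_delta):
        r1, r2 = np.random.rand(n, d), np.random.rand(n, d)
        A = 2.0 * a * r1 - a                  # A = 2a*r1 - a
        C = 2.0 * r2                          # C = 2*r2
        D = np.abs(C * leader - X)            # |C . X_leader - X|
        candidates.append(leader - A * D)     # Eqs. (2)-(4)
    return sum(candidates) / 3.0              # Eq. (5): average of Xa, Xb, Xc
```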

2.2. Niching Technique

Niching techniques have the advantage of maintaining multiple solutions, which helps them handle multi-modal functions and effectively reach the global optimal basin. In the proposed technique, the fitness Euclidean-distance ratio GWO (FER-GWO) is employed to locate the best nearby solutions; FER-GWO follows the concept of FDR-PSO [51]. The FER value between two wolves i and j is calculated with the following equation:
$$\mathrm{FER}_{ij} = \alpha \cdot \frac{F(\vec{p}_i) - F(\vec{p}_j)}{\left\| \vec{p}_i - \vec{p}_j \right\|} \tag{6}$$
Here, $\vec{p}_i$ and $\vec{p}_j$ are the positions of wolves i and j, and $F(\vec{p}_i)$ and $F(\vec{p}_j)$ are their cost function values.
The scaling factor $\alpha$ is
$$\alpha = \frac{S}{\left| F(\vec{p}_b) - F(\vec{p}_w) \right|} \tag{7}$$
In the above equation, $F(\vec{p}_w)$ and $F(\vec{p}_b)$ are the cost function values of the worst and best wolves in the current population. The search-space size S is calculated from the decision-variable bounds as:
$$S = \sqrt{\sum_{n=1}^{D} \left( x_n^{u} - x_n^{l} \right)^2} \tag{8}$$
where $x_n^{u}$ and $x_n^{l}$ are the upper and lower bounds of the n-th decision variable.
Since multi-modal optimization needs extensive exploration, niching is an efficient technique to meet this requirement. Therefore, in the proposed GWO algorithm, we incorporate the personal-best feature to keep track of the best solutions discovered so far during the search. At each iteration, the wolves are guided towards fitter neighborhood points, obtained by calculating the FER values. An advantage of the FER value is that it requires no parameter specification. Moreover, a larger population size is recommended for niching so that all the local and global optima can be located.
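As an illustration, a minimal Python sketch of Equations (6)–(8) is given below. It assumes minimization and uses a small epsilon guard against division by zero; both are implementation assumptions not specified in the paper.

```python
import numpy as np

def fer_nbest(P, F, bounds):
    """For each personal best p_i, return its neighbourhood best: the p_j
    maximizing FER_ij (Equations (6)-(8)). P: (n, d) personal-best positions,
    F: (n,) their cost values (minimization), bounds: (d, 2) [lower, upper]."""
    S = np.sqrt(np.sum((bounds[:, 1] - bounds[:, 0]) ** 2))  # Eq. (8): search-space diagonal
    alpha = S / (np.max(F) - np.min(F) + 1e-12)              # Eq. (7): scaling factor
    nbest = np.empty_like(P)
    for i in range(len(P)):
        dist = np.linalg.norm(P - P[i], axis=1) + 1e-12      # avoid division by zero
        fer = alpha * (F[i] - F) / dist                      # Eq. (6): fitter and closer => larger
        fer[i] = -np.inf                                     # exclude the wolf itself
        nbest[i] = P[np.argmax(fer)]
    return nbest
```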

2.3. Modified Velocity Update Equation

During each iteration, we determine the nbest_i of each wolf based on the FER value from Equation (6). The nbest_i value is then used to modify the velocity update equations of the beta and delta wolves, which are formulated as:
$$\vec{X}_b = \mathrm{nbest}_i - \vec{A} \cdot \left| \vec{C} \cdot \mathrm{nbest}_i - \vec{X} \right| \tag{9}$$
$$\vec{X}_c = \mathrm{nbest}_i - \vec{A} \cdot \left| \vec{C} \cdot \mathrm{nbest}_i - \vec{X} \right| \tag{10}$$
We set a niching constant (NC) of 0.5 and generate a random number r_i during each iteration. If r_i > NC, the beta and delta wolves are guided by nbest_i using Equations (9) and (10); otherwise, the original Equations (3) and (4) are used. Note that the alpha wolf is always guided by the original position update equation (Equation (2)).
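A compact sketch of this rule follows, with NC = 0.5 as in the paper; the function name and per-wolf call structure are illustrative assumptions.

```python
import numpy as np

NC = 0.5  # niching constant used in the paper

def leaders_for_wolf(i, X_beta, X_delta, nbest):
    """Choose the guides for the beta- and delta-driven components of wolf i's
    update: nbest[i] (Eqs. (9)-(10)) when r_i > NC, otherwise the original
    beta and delta leaders (Eqs. (3)-(4))."""
    if np.random.rand() > NC:
        return nbest[i], nbest[i]
    return X_beta, X_delta
```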

2.4. Local Search

It is challenging for niching techniques to reliably hit the exact global optima in complex multi-modal problems. We therefore incorporated a local search technique into our NGWO to generate offspring close to the personal best (Pbest). It also improves the algorithm's fine-search capability, which increases the likelihood of obtaining optimal results. The local search follows the definition of Qu et al. [2], and the pseudo-code is provided below in Algorithm 1.
Algorithm 1 Local Search
1: Update the present Pbest by NGWO
2: for i = 1 to NP (number of wolves)
3:  Search Pbest_nearesti (the nearest Pbest member to Pbesti)
4:  if fit(Pbest_nearesti) <= fit(Pbesti)
5:   Temp = Pbesti + 1.5 * rand * (Pbest_nearesti - Pbesti)
6:  else
7:   Temp = Pbesti + 1.5 * rand * (Pbesti - Pbest_nearesti)
8:  end if
9:  Check for bounds violation, and assess Temp.
10:  if fit(Temp) < fit(Pbesti)
11:   Pbesti = Temp
12:  end if
13: end for
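A Python translation of Algorithm 1 might look as follows; the greedy acceptance and bound clipping follow the pseudo-code, while the function signature is an assumption.

```python
import numpy as np

def local_search(pbest, fit, func, lb, ub):
    """Sketch of Algorithm 1: perturb each personal best towards (or away
    from) its nearest neighbour and keep the move only if it improves.
    pbest: (n, d); fit: (n,) costs of pbest; func: objective; lb/ub: bounds."""
    for i in range(len(pbest)):
        dist = np.linalg.norm(pbest - pbest[i], axis=1)
        dist[i] = np.inf
        j = np.argmin(dist)                   # nearest Pbest member
        if fit[j] <= fit[i]:                  # move towards the fitter neighbour
            temp = pbest[i] + 1.5 * np.random.rand() * (pbest[j] - pbest[i])
        else:                                 # otherwise move away from it
            temp = pbest[i] + 1.5 * np.random.rand() * (pbest[i] - pbest[j])
        temp = np.clip(temp, lb, ub)          # bounds-violation check
        f_temp = func(temp)
        if f_temp < fit[i]:                   # greedy acceptance
            pbest[i], fit[i] = temp, f_temp
    return pbest, fit
```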
During each iteration of the NGWO, the niching technique is applied by using Equations (6)–(8). The wolves update their positions based on Equations (2)–(5) and (9)–(10). Finally, local search is applied to the wolves’ personal best (Pbest). Figure 1 shows a detailed flow map of the proposed algorithm.

3. Results and Discussion

3.1. Experimental Setup

We comprehensively evaluated the proposed algorithm's performance on 23 test functions posed as minimization problems [13,18]. These functions are very common for testing an algorithm's effectiveness and consist of 7 unimodal, 6 multi-modal, and 10 fixed-dimensional multi-modal functions. The population size and iteration number are set to 50 and 1000, respectively, for the proposed method. We keep these two settings identical to the compared studies to make the comparison fair and unbiased. The compared PSO, GSA, GWO, and SOGWO values are taken from [42], and the niching CSA values are taken from [50]. The MATLAB codes of the Jaya and IGWO algorithms were obtained from the authors' websites for comparison. The test functions are presented below, and their detailed properties are described in Table 1.
$$F_1(y) = \sum_{j=1}^{D} y_j^2$$
$$F_2(y) = \sum_{j=1}^{D} \left| y_j \right| + \prod_{j=1}^{D} \left| y_j \right|$$
$$F_3(y) = \sum_{i=1}^{D} \left( \sum_{j=1}^{i} y_j \right)^2$$
$$F_4(y) = \max_j \left\{ \left| y_j \right|,\; 1 \le j \le D \right\}$$
$$F_5(y) = \sum_{j=1}^{D-1} \left[ 100 \left( y_{j+1} - y_j^2 \right)^2 + \left( y_j - 1 \right)^2 \right]$$
$$F_6(y) = \sum_{j=1}^{D} \left( \left\lfloor y_j + 0.5 \right\rfloor \right)^2$$
$$F_7(y) = \sum_{j=1}^{D} j\, y_j^4 + \mathrm{random}[0, 1)$$
$$F_8(y) = \sum_{j=1}^{D} -y_j \sin\left( \sqrt{\left| y_j \right|} \right)$$
$$F_9(y) = \sum_{j=1}^{D} \left[ y_j^2 - 10 \cos\left( 2 \pi y_j \right) + 10 \right]$$
$$F_{10}(y) = -20 \exp\left( -0.2 \sqrt{ \frac{1}{D} \sum_{j=1}^{D} y_j^2 } \right) - \exp\left( \frac{1}{D} \sum_{j=1}^{D} \cos\left( 2 \pi y_j \right) \right) + 20 + e$$
$$F_{11}(y) = \frac{1}{4000} \sum_{j=1}^{D} y_j^2 - \prod_{j=1}^{D} \cos\left( \frac{y_j}{\sqrt{j}} \right) + 1$$
$$F_{12}(y) = \frac{\pi}{D} \left\{ 10 \sin^2\left( \pi x_1 \right) + \sum_{j=1}^{D-1} \left( x_j - 1 \right)^2 \left[ 1 + 10 \sin^2\left( \pi x_{j+1} \right) \right] + \left( x_D - 1 \right)^2 \right\} + \sum_{j=1}^{D} u\left( y_j, 10, 100, 4 \right)$$
where
$$x_j = 1 + \frac{y_j + 1}{4}, \qquad u\left( y_j, a, k, m \right) = \begin{cases} k \left( y_j - a \right)^m, & y_j > a \\ 0, & -a \le y_j \le a \\ k \left( -y_j - a \right)^m, & y_j < -a \end{cases}$$
$$F_{13}(y) = 0.1 \left\{ \sin^2\left( 3 \pi y_1 \right) + \sum_{j=1}^{D-1} \left( y_j - 1 \right)^2 \left[ 1 + \sin^2\left( 3 \pi y_{j+1} \right) \right] + \left( y_D - 1 \right)^2 \left[ 1 + \sin^2\left( 2 \pi y_D \right) \right] \right\} + \sum_{j=1}^{D} u\left( y_j, 5, 100, 4 \right)$$
$$F_{14}(y) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} \left( y_i - a_{ij} \right)^6} \right)^{-1}$$
$$F_{15}(y) = \sum_{i=1}^{11} \left[ a_i - \frac{y_1 \left( b_i^2 + b_i y_2 \right)}{b_i^2 + b_i y_3 + y_4} \right]^2$$
$$F_{16}(y) = 4 y_1^2 - 2.1 y_1^4 + \frac{1}{3} y_1^6 + y_1 y_2 - 4 y_2^2 + 4 y_2^4$$
$$F_{17}(y) = \left( y_2 - \frac{5.1}{4 \pi^2} y_1^2 + \frac{5}{\pi} y_1 - 6 \right)^2 + 10 \left( 1 - \frac{1}{8 \pi} \right) \cos\left( y_1 \right) + 10$$
$$F_{18}(y) = \left[ 1 + \left( y_1 + y_2 + 1 \right)^2 \left( 19 - 14 y_1 + 3 y_1^2 - 14 y_2 + 6 y_1 y_2 + 3 y_2^2 \right) \right] \times \left[ 30 + \left( 2 y_1 - 3 y_2 \right)^2 \left( 18 - 32 y_1 + 12 y_1^2 + 48 y_2 - 36 y_1 y_2 + 27 y_2^2 \right) \right]$$
$$F_{19}(y) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} \left( y_j - p_{ij} \right)^2 \right)$$
$$F_{20}(y) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} \left( y_j - p_{ij} \right)^2 \right)$$
$$F_{21}(y) = -\sum_{i=1}^{5} \left[ \left( y - a_i \right) \left( y - a_i \right)^{T} + c_i \right]^{-1}$$
$$F_{22}(y) = -\sum_{i=1}^{7} \left[ \left( y - a_i \right) \left( y - a_i \right)^{T} + c_i \right]^{-1}$$
$$F_{23}(y) = -\sum_{i=1}^{10} \left[ \left( y - a_i \right) \left( y - a_i \right)^{T} + c_i \right]^{-1}$$
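For instance, F1 (the sphere function) and F9 (Rastrigin) can be implemented in a few lines; these sketches follow the standard definitions cited above [13,18].

```python
import numpy as np

def f1_sphere(y):
    """F1: sum of squares; global optimum 0 at the origin."""
    return np.sum(y ** 2)

def f9_rastrigin(y):
    """F9: highly multi-modal; global optimum 0 at the origin."""
    return np.sum(y ** 2 - 10.0 * np.cos(2.0 * np.pi * y) + 10.0)
```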
Table 2 displays the parameter configurations of the compared algorithms. For each test function, we ran each algorithm 30 times in MATLAB; the performance indexes (mean, standard deviation (SD), and minimum (min) values) and the Friedman test results are reported in Table 3, Table 4 and Table 5, with the best values in bold.

3.2. Classical Benchmark Functions

3.2.1. Exploitation Analysis

The unimodal functions (F1–F7) are well recognized for evaluating an algorithm's exploitation ability. The performance indexes for the unimodal functions are reported in Table 3. The proposed NGWO shows superior results on the considered unimodal functions, outperforming all other algorithms on five of them (F1, F2, F4, F5, F7). Nonetheless, the NCSA algorithm performs better on one unimodal function (F3), and the GSA achieved the best indexes on another (F6).

3.2.2. Exploration Analysis and Local Optima Avoidance

Multi-modal functions are renowned for having multiple local optima, whose number increases exponentially with the dimension. Therefore, the multi-modal functions (F8–F23) are suitable for testing the exploration and local optima avoidance abilities of an algorithm. The NGWO shows very promising results on these functions. As seen in Table 4, the NGWO outperforms all other algorithms and attained all the best indexes for nine functions (F8, F9, F11, F16–F20, F23). Moreover, the NGWO found the global optima of ten multi-modal functions with pinpoint precision (F9, F11, F16–F23). The NGWO achieved the best SD indexes for a total of 14 functions, indicating the algorithm's high local optima avoidance ability: the lowest SD values mean there is little variation in the algorithm's optimum values, implying that the algorithm is stable.

3.2.3. Convergence Analysis

This section illustrates the convergence ability of the NGWO and the compared algorithms. Figure 2 shows the convergence characteristics on the unimodal benchmarks, while Figure 3 depicts the convergence behavior on the six multi-modal benchmarks. In Figure 2 and Figure 3, the horizontal axis denotes the iteration number and the vertical axis shows the median function value over 30 runs. In all figures, the NGWO maintains a consistent downward slope throughout the iterations, which indicates that the wolves are sharing information to improve the cost function values (fitness) over the course of the run. For most of the convergence curves in Figure 2 and Figure 3, the NGWO shows superior convergence ability, which also indicates a proper balance between exploitation and exploration. Furthermore, the NGWO took only 100 iterations to achieve the exact global optimum of zero for the multi-modal functions F9 and F11, demonstrating its rapid convergence speed.
The parameter space, search history, trajectory of the first wolf, convergence curves, and average fitness of the wolves are plotted in Figure 4 and Figure 5 to illustrate the convergence behavior in detail. For this analysis, the number of wolves and iterations were set to 50 and 100, respectively. It is known from the literature [52] that abrupt position changes during the initial iterations are preferable for improving an algorithm's exploration ability, and that these changes should decrease later in the run to strengthen exploitation. The trajectory and search history of the first wolf are shown in Figure 4 and Figure 5. The first wolf's search history (second column of Figure 4 and Figure 5) makes it explicit that the wolves searched extensively through the search space and exploited the best area. The trajectory of the first wolf (third column of Figure 4 and Figure 5) indicates the same phenomenon: exploration is emphasized in the initial steps and decreases over the iterations, while exploitation strengthens in the final stages. All of this indicates that the NGWO has good convergence ability and achieves a proper balance between local exploitation and global exploration.
Figure 6 and Figure 7 show the box plots of each algorithm for the multi-modal functions. The first six are 30-dimensional problems (F8–F13), while the next ten are fixed-dimensional multi-modal problems; their dimensions are reported in Table 1. In MATLAB, we ran each algorithm 30 times per test function, saved the best value from each run, and plotted the values as box plots. These box plots illustrate the distribution of the benchmark results and indicate the dispersion and symmetry of the data sets. Each box plot shows the lower bound, first quartile, median, third quartile, upper bound, and outliers. A larger rectangular box indicates more scattered data, less stable results, and comparatively worse performance. The red crossbar indicates the median of a data set, and the red plus sign (+) indicates outliers. Among the plots of Figure 6, the box of the NGWO lies lower than those of the other seven algorithms for four functions (F8–F11, Figure 6a–d), and its box size is also the smallest for these functions. In Figure 6b,d, the NGWO reached exactly zero for all data sets, which is why no box appears for the NGWO in these two panels. In Figure 7a–j, the NGWO shows the smallest spread of the data sets, and its box plots lie below those of the other algorithms. These observations indicate the superiority of the proposed NGWO, the low variation in its results, and its stability.

3.2.4. Ranking of the Algorithms

To analyze the NGWO algorithm's performance more thoroughly, we conducted non-parametric Friedman tests to rank the algorithms. This multiple-comparison test is based on repeated-measures ANOVA and can determine the differences among the compared algorithms. The best values of each algorithm over 30 runs were stored in a matrix, and the Friedman test was applied to this matrix to obtain each algorithm's mean rank. The Friedman test also provides the corresponding p-value for each test function. It is evident from the Friedman test results in Table 5 that the proposed NGWO attained the best rank for 16 of the 23 benchmark functions. Additionally, JAYA, GSA, and PSO obtained the best ranking for 3, 2, and 1 functions, respectively. Very low p-values indicate that the Friedman test results are statistically significant.
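A sketch of this procedure using SciPy is shown below; the placeholder data matrix stands in for the stored best values, and scipy.stats.friedmanchisquare treats each algorithm's 30 best values as one treatment.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# results: (30 runs x n_algorithms) matrix of best values for one benchmark
results = np.random.rand(30, 8)                      # placeholder data
stat, p_value = friedmanchisquare(*results.T)        # one sample per algorithm
mean_ranks = rankdata(results, axis=1).mean(axis=0)  # lower rank = better (minimization)
print(p_value, mean_ranks)
```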

3.3. Engineering Application of NGWO

The proposed algorithm was also tested on three real-life engineering case studies collected from the literature [18,43]. All three problems have inequality constraints, and we use a hard-penalty approach to handle them. We considered 100 wolves over 100 iterations to solve these three problems.
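A minimal sketch of such a hard-penalty wrapper is given below; the paper does not state the penalty magnitude, so the constant BIG is an assumed value.

```python
BIG = 1e10  # assumed penalty magnitude; the paper does not specify it

def penalized(func, constraints):
    """Hard-penalty wrapper: constraints(y) returns a list of inequality
    values c_i(y), feasible when all are <= 0; any violation returns BIG."""
    def wrapped(y):
        if any(v > 0 for v in constraints(y)):
            return BIG
        return func(y)
    return wrapped
```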

3.3.1. Welded Beam Design

The objective of this case study is to minimize a welded beam's fabrication cost. It involves four decision variables, bar thickness (b), bar height (t), attached part length (l), and weld thickness (h), and seven inequality constraints (shear and bending stress, side constraints, end deflection, and buckling load). The problem is mathematically formulated as [43]:
$$f(y) = 1.10471\, y_1^2 y_2 + 0.04811\, y_3 y_4 \left( 14 + y_2 \right)$$
where
$$\tau(y) = \sqrt{ \tau_1^2 + \frac{\tau_1 \tau_2\, y_2}{R} + \tau_2^2 }$$
$$\tau_1 = \frac{P}{\sqrt{2}\, y_1 y_2}, \qquad \tau_2 = \frac{M R}{J}$$
$$M = P \left( L + \frac{y_2}{2} \right), \qquad R = \sqrt{ \frac{y_2^2}{4} + \left( \frac{y_1 + y_3}{2} \right)^2 }$$
$$J = 2 \sqrt{2}\, y_1 y_2 \left[ \frac{y_2^2}{12} + \left( \frac{y_1 + y_3}{2} \right)^2 \right]$$
$$\sigma(y) = \frac{6 P L}{y_4 y_3^2}, \qquad \delta(y) = \frac{4 P L^3}{E\, y_4 y_3^3}$$
$$P_c(y) = \frac{4.013\, E\, y_3 y_4^3}{6 L^2} \left( 1 - \frac{0.25\, y_3}{L} \sqrt{\frac{E}{G}} \right)$$
Here, P = 6000 lb, L = 14 inch, $\delta_{\max}$ = 0.25 inch, E = 30 × 10^6 psi, G = 12 × 10^6 psi, $\tau_{\max}$ = 13,600 psi, $\sigma_{\max}$ = 30,000 psi. The inequality constraints are:
$$c_1(y) = \tau(y) - \tau_{\max} \le 0$$
$$c_2(y) = \sigma(y) - \sigma_{\max} \le 0$$
$$c_3(y) = y_1 - y_4 \le 0$$
$$c_4(y) = 1.10471\, y_1^2 + 0.04811\, y_3 y_4 \left( 14 + y_2 \right) - 5 \le 0$$
$$c_5(y) = \delta(y) - \delta_{\max} \le 0$$
$$c_6(y) = P - P_c(y) \le 0$$
$$c_7(y) = 0.125 - y_1 \le 0$$
The decision variable bounds are:
$$0.1 \le y_1 \le 2, \quad 0.1 \le y_2 \le 10, \quad 0.1 \le y_3 \le 10, \quad 0.1 \le y_4 \le 2$$
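The objective and constraints above can be coded directly; the following Python sketch mirrors the reconstructed formulation (function and variable names are illustrative). Combined with the hard-penalty wrapper sketched in Section 3.3, penalized(welded_beam_cost, welded_beam_constraints) yields an unconstrained objective for the NGWO.

```python
import numpy as np

P, L, E, G = 6000.0, 14.0, 30e6, 12e6
tau_max, sigma_max, delta_max = 13600.0, 30000.0, 0.25

def welded_beam_cost(y):
    y1, y2, y3, y4 = y
    return 1.10471 * y1**2 * y2 + 0.04811 * y3 * y4 * (14.0 + y2)

def welded_beam_constraints(y):
    """Returns the seven inequality values c_i(y); feasible when all <= 0."""
    y1, y2, y3, y4 = y
    tau1 = P / (np.sqrt(2.0) * y1 * y2)
    M = P * (L + y2 / 2.0)
    R = np.sqrt(y2**2 / 4.0 + ((y1 + y3) / 2.0) ** 2)
    J = 2.0 * np.sqrt(2.0) * y1 * y2 * (y2**2 / 12.0 + ((y1 + y3) / 2.0) ** 2)
    tau2 = M * R / J
    tau = np.sqrt(tau1**2 + tau1 * tau2 * y2 / R + tau2**2)
    sigma = 6.0 * P * L / (y4 * y3**2)
    delta = 4.0 * P * L**3 / (E * y4 * y3**3)
    Pc = 4.013 * E * y3 * y4**3 / (6.0 * L**2) * (1.0 - 0.25 * y3 * np.sqrt(E / G) / L)
    return [tau - tau_max,
            sigma - sigma_max,
            y1 - y4,
            1.10471 * y1**2 + 0.04811 * y3 * y4 * (14.0 + y2) - 5.0,
            delta - delta_max,
            P - Pc,
            0.125 - y1]
```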

3.3.2. Tension–Compression Coil Spring Design

In this case study, the weight of a tension–compression coil spring is minimized. It involves three decision variables (number of coils (n), wire diameter (d), and mean coil diameter (D)) and four inequality constraints. The problem is numerically represented as:
$$\text{Minimize } f(y) = \left( y_3 + 2 \right) y_1 y_2^2$$
The inequality constraints are:
$$c_1(y) = 1 - \frac{y_1^3 y_3}{71785\, y_2^4} \le 0$$
$$c_2(y) = \frac{4 y_1^2 - y_1 y_2}{12566 \left( y_1 y_2^3 - y_2^4 \right)} + \frac{1}{5108\, y_2^2} - 1 \le 0$$
$$c_3(y) = 1 - \frac{140.45\, y_2}{y_1^2 y_3} \le 0$$
$$c_4(y) = \frac{y_1 + y_2}{1.5} - 1 \le 0$$
Here, $0.25 \le y_1 \le 1.30$, $0.05 \le y_2 \le 2$, and $2 \le y_3 \le 15$.
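A corresponding Python sketch of this formulation (illustrative names, a direct transcription of the constraints above) is:

```python
def spring_weight(y):
    y1, y2, y3 = y  # following the bounds above: y1 = D, y2 = d, y3 = n
    return (y3 + 2.0) * y1 * y2**2

def spring_constraints(y):
    """Four inequality constraints; feasible when all values are <= 0."""
    y1, y2, y3 = y
    return [1.0 - y1**3 * y3 / (71785.0 * y2**4),
            (4.0 * y1**2 - y1 * y2) / (12566.0 * (y1 * y2**3 - y2**4))
                + 1.0 / (5108.0 * y2**2) - 1.0,
            1.0 - 140.45 * y2 / (y1**2 * y3),
            (y1 + y2) / 1.5 - 1.0]
```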

3.3.3. Three-Bar Truss Design

Ray and Saini [53] first proposed this case study, which minimizes the truss's weight. It has two variables and three constraints and can be mathematically presented as:
$$\text{Minimize } f(y) = \left( 2 \sqrt{2}\, y_1 + y_2 \right) \times l$$
Subject to
$$c_1(y) = \frac{\sqrt{2}\, y_1 + y_2}{\sqrt{2}\, y_1^2 + 2 y_1 y_2}\, P - \sigma \le 0$$
$$c_2(y) = \frac{y_2}{\sqrt{2}\, y_1^2 + 2 y_1 y_2}\, P - \sigma \le 0$$
$$c_3(y) = \frac{1}{\sqrt{2}\, y_2 + y_1}\, P - \sigma \le 0$$
The variable ranges are $0 \le y_1 \le 1$ and $0 \le y_2 \le 1$, with l = 100 cm, P = 2 kN/cm², and σ = 2 kN/cm².
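A sketch of the three-bar truss formulation, again a direct transcription with illustrative names:

```python
import numpy as np

l, P, sigma = 100.0, 2.0, 2.0  # cm, kN/cm^2, kN/cm^2

def truss_weight(y):
    y1, y2 = y
    return (2.0 * np.sqrt(2.0) * y1 + y2) * l

def truss_constraints(y):
    """Three stress constraints; feasible when all values are <= 0."""
    y1, y2 = y
    denom = np.sqrt(2.0) * y1**2 + 2.0 * y1 * y2
    return [(np.sqrt(2.0) * y1 + y2) / denom * P - sigma,
            y2 / denom * P - sigma,
            1.0 / (np.sqrt(2.0) * y2 + y1) * P - sigma]
```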

3.4. Results of the Engineering Application

3.4.1. Optimization Result of Welded Beam Design

It is evident from the results reported in Table 6 that the NGWO outperforms all the previous algorithms on this case study. The proposed algorithm achieved the best values for four of the five compared indexes (best, mean, worst, standard deviation, and NFEs), whereas the HEAA algorithm achieved the best result for the standard deviation index.

3.4.2. Result of Tension–Compression Coil Spring Design

For the tension–compression coil spring design, we compare our findings with ten previous studies from the literature in Table 7. The NGWO algorithm achieved the optimal values for four indexes, while the FSA algorithm attained the optimal value for the SD index.

3.4.3. Three-Bar Truss Design Result

Table 8 summarizes the results of the three-bar truss design problem. The WCA and DEDS algorithms achieved three and two best indexes, respectively. The NGWO algorithm's results are very close to the previous best results obtained by various optimization algorithms.

4. Conclusions

The GWO algorithm was inspired by the leadership structure and group hunting behavior of wolves. However, the GWO suffers from a lack of population diversity and is vulnerable to premature convergence on multi-modal problems. Hence, a new variant of GWO called niching GWO (NGWO), with a modified position update equation, is proposed in this research to improve the GWO's performance on multi-modal problems. Niching techniques are well known for maintaining multiple optimal solutions in multi-modal optimization problems. A fitness Euclidean-distance ratio-based niching GWO, called FER-GWO, is used to strengthen the search ability of the algorithm: using the FER technique, the wolves are guided towards nearby wolves' best positions that have better cost function values.
The proposed NGWO algorithm has been thoroughly tested on 23 traditional benchmark functions and three real-world engineering cases. It achieved very promising results and outperformed most of the compared algorithms on the considered benchmarks and engineering case studies. This superiority is accomplished by incorporating the niching technique into the basic GWO algorithm and modifying the position update equation, while the local search improves the fine-search ability. The reported results indicate that the algorithm achieves a good balance between global exploration and local exploitation, which further increases its accuracy. This research focuses on single-objective optimization problems only; future work will extend the NGWO algorithm to multi-objective optimization problems.

Author Contributions

Conceptualization, R.A. and S.M.; methodology, R.A., S.M., M.S. and A.N.; software, R.A.; validation, A.N., R.A., S.M. and J.I.; formal analysis, R.A., J.I.; investigation, R.A., J.I.; resources, R.A., A.N., M.S.; data curation, R.A.; writing—original draft preparation, R.A., S.M.; writing—review and editing, A.N., M.S., J.I.; visualization, R.A., J.I.; supervision, S.M., A.N.; project administration, A.N., S.M., M.S.; funding acquisition, A.N., M.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Zayed University, Abu Dhabi, UAE, and in part by the AnalytiCray Solutions, Kuala Lumpur, Malaysia, and in part by the Taif University Researchers Supporting Project number (TURSP-2020/79), Taif University, Taif, Saudi Arabia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Xu, X.; Tang, Y.; Li, J.; Hua, C.; Guan, X. Dynamic multi-swarm particle swarm optimizer with cooperative learning strategy. Appl. Soft Comput. 2015, 29, 169–183. [Google Scholar] [CrossRef]
  2. Qu, B.; Liang, J.; Suganthan, P. Niching particle swarm optimization with local search for multi-modal optimization. Inf. Sci. 2012, 197, 131–143. [Google Scholar] [CrossRef]
  3. Al Amin, M.; Abdul-Rani, A.M.; Ahmed, R.; Rao, T.V.V.L.N. Multiple-objective optimization of hydroxyapatite-added EDM technique for processing of 316L-steel. Mater. Manuf. Process. 2021, 36, 1–12. [Google Scholar] [CrossRef]
  4. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; The MIT Press: Cambridge, MA, USA, 1975. [Google Scholar]
  5. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  6. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the Sixth International Symposium on Micro Machine and Human Science, MHS’95, Nagoya, Japan, 4–6 October 1995; pp. 39–43. [Google Scholar] [CrossRef]
  7. Storn, R.; Price, K. Differential evolution—A simple and efficient adaptive scheme for global optimization over continuous spaces. J. Glob. Optim. 1995, 23, 1–15. [Google Scholar]
  8. Dorigo, M.; Gambardella, L. Ant colony system: A cooperative learning approach to the traveling salesman problem. IEEE Trans. Evol. Comput. 1997, 1, 53–66. [Google Scholar] [CrossRef] [Green Version]
  9. Lu, X.; Zhou, Y. A novel global convergence algorithm: Bee collecting pollen algorithm. In Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence. ICIC 2008. Lecture Notes in Computer Science; Huang, D.S., Wunsch, D.C., Levine, D.S., Jo, K.H., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; Volume 5227. [Google Scholar] [CrossRef]
  10. Formato, R.A. Central force optimization: A new metaheuristic with applications in applied electromagnetics. Prog. Electromagn. Res. 2007, 77, 425–491. [Google Scholar] [CrossRef] [Green Version]
  11. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  12. Tabari, A.; Ahmad, A. A new optimization method: Electro-Search algorithm. Comput. Chem. Eng. 2017, 103, 1–11. [Google Scholar] [CrossRef]
  13. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  14. Rao, R.; Savsani, V.; Vakharia, D. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  15. Alatas, B. ACROA: Artificial Chemical Reaction Optimization Algorithm for global optimization. Expert Syst. Appl. 2011, 38, 13170–13180. [Google Scholar] [CrossRef]
  16. Yang, X.S. Firefly algorithm, stochastic test functions and design optimization. Int. J. Bio Inspired Comput. 2010, 2, 78. [Google Scholar] [CrossRef]
  17. Erol, O.K.; Eksin, I. A new optimization method: Big Bang-Big Crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  18. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  19. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 2016, 27, 1053–1073. [Google Scholar] [CrossRef]
  20. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Futur. Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  21. Rao, R.V. Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems. Int. J. Ind. Eng. Comput. 2016, 7, 19–34. [Google Scholar] [CrossRef]
  22. Tu, Q.; Chen, X.; Liu, X. Hierarchy strengthened Grey Wolf Optimizer for numerical optimization and feature selection. IEEE Access 2019, 7, 78012–78028. [Google Scholar] [CrossRef]
  23. Al-Tashi, Q.; Rais, H.M.; Abdulkadir, S.J.; Mirjalili, S. Feature selection based on Grey Wolf Optimizer for oil & gas reservoir classification. In Proceedings of the 2020 International Conference on Computational Intelligence (ICCI), Seri Iskandar, Perak, Malaysia, 8–9 October 2020; pp. 211–216. [Google Scholar] [CrossRef]
  24. Emary, E.; Zawbaa, H.M.; Grosan, C.; Hassenian, A.E. Feature subset selection approach by gray-wolf optimization. In Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2015; Volume 334. [Google Scholar] [CrossRef]
  25. Song, J.; Wang, J.; Lu, H. A novel combined model based on advanced optimization algorithm for short-term wind speed forecasting. Appl. Energy 2018, 215, 643–658. [Google Scholar] [CrossRef]
  26. El-Fergany, A.A.; Hasanien, H.M. Single and multi-objective optimal power flow using Grey Wolf Optimizer and differential evolution algorithms. Electr. Power Compon. Syst. 2015, 43, 1548–1559. [Google Scholar] [CrossRef]
  27. Sulaiman, M.H.; Mustaffa, Z.; Mohamed, M.R.; Aliman, O. Using the gray wolf optimizer for solving optimal reactive power dispatch problem. Appl. Soft Comput. J. 2015, 32, 286–292. [Google Scholar] [CrossRef] [Green Version]
  28. Katarya, R.; Verma, O.P. Recommender system with Grey Wolf Optimizer and FCM. Neural Comput. Appl. 2018, 30, 1679–1687. [Google Scholar] [CrossRef]
  29. Kamboj, V.K. A novel hybrid PSO–GWO approach for unit commitment problem. Neural Comput. Appl. 2016, 27, 1643–1655. [Google Scholar] [CrossRef]
  30. Panwar, L.K.; Reedy, S.; Verma, A.; Panigrahi, B.K.; Kumar, R. Binary Grey Wolf Optimizer for large scale unit commitment problem. Swarm Evol. Comput. 2018, 38, 251–266. [Google Scholar] [CrossRef]
  31. Biswas, K.; Vasant, P.M.; Vintaned, J.A.G.; Watada, J. Cellular automata-based multi-objective hybrid Grey Wolf Optimization and particle swarm optimization algorithm for wellbore trajectory optimization. J. Nat. Gas Sci. Eng. 2021, 85, 103695. [Google Scholar] [CrossRef]
  32. Korayem, L.; Khorsid, M.; Kassem, S.S. Using Grey Wolf algorithm to solve the capacitated vehicle routing problem. Conf. Ser. Mater. Sci. Eng. 2015, 2015, 83. [Google Scholar] [CrossRef] [Green Version]
  33. Lu, C.; Xiao, S.; Li, X.; Gao, L. An effective multi-objective discrete grey Wolf optimizer for a real-world scheduling problem in welding production. Adv. Eng. Softw. 2016, 99, 161–176. [Google Scholar] [CrossRef] [Green Version]
  34. Komaki, G.; Kayvanfar, V. Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time. J. Comput. Sci. 2015, 8, 109–120. [Google Scholar] [CrossRef]
  35. Asadzadeh, L. A local search genetic algorithm for the job shop scheduling problem with intelligent agents. Comput. Ind. Eng. 2015, 85, 376–383. [Google Scholar] [CrossRef]
  36. Jayabarathi, T.; Raghunathan, T.; Adarsh, B.; Suganthan, P.N. Economic dispatch using hybrid grey wolf optimizer. Energy 2016, 111, 630–641. [Google Scholar] [CrossRef]
  37. Pradhan, M.; Roy, P.K.; Pal, T. Grey Wolf optimization applied to economic load dispatch problems. Int. J. Electr. Power Energy Syst. 2016, 83, 325–334. [Google Scholar] [CrossRef]
  38. Guha, D.; Roy, P.K.; Banerjee, S. Load frequency control of large scale power system using quasi-oppositional Grey Wolf Optimization algorithm. Eng. Sci. Technol. Int. J. 2016, 19, 1693–1713. [Google Scholar] [CrossRef] [Green Version]
  39. Singh, N.; Singh, S. A novel hybrid GWO-SCA approach for optimization problems. Eng. Sci. Technol. Int. J. 2017, 20, 1586–1601. [Google Scholar] [CrossRef]
  40. Jiang, T.; Zhang, C. Application of Grey Wolf Optimization for solving combinatorial problems: Job Shop and Flexible Job Shop scheduling cases. IEEE Access 2018, 6, 26231–26240. [Google Scholar] [CrossRef]
  41. Long, W.; Jiao, J.; Liang, X.; Tang, M. An exploration-enhanced Grey Wolf Optimizer to solve high-dimensional numerical optimization. Eng. Appl. Artif. Intell. 2018, 68, 63–80. [Google Scholar] [CrossRef]
  42. Dhargupta, S.; Ghosh, M.; Mirjalili, S.; Sarkar, R. Selective opposition based Grey Wolf Optimization. Expert Syst. Appl. 2020, 151, 113389. [Google Scholar] [CrossRef]
  43. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S. An improved Grey Wolf Optimizer for solving engineering problems. Expert Syst. Appl. 2021, 166, 113917. [Google Scholar] [CrossRef]
  44. Yin, X.; Germay, N. A Fast genetic algorithm with sharing scheme using cluster analysis methods in multimodal function optimization. In Artificial Neural Nets and Genetic Algorithms; Springer: Cham, Switzerland, 1993. [Google Scholar]
  45. Mahfoud, S.W. Crowding and preselection revisited. In Parallel Problem Solving from Nature; Elsevier: Amsterdam, The Netherlands, 1992. [Google Scholar]
  46. Petrowski, A. Clearing procedure as a niching method for genetic algorithms. In Proceedings of the IEEE International Conference on Evolutionary Computation, Nagoya, Japan, 20–22 May 1996. [Google Scholar] [CrossRef]
  47. Li, X. Niching without niching parameters: Particle swarm optimization using a ring topology. IEEE Trans. Evol. Comput. 2010, 14, 150–169. [Google Scholar] [CrossRef]
  48. Li, X. A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio. In Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation—GECCO ’07, London, UK, 7–11 July 2007; pp. 78–85. [Google Scholar] [CrossRef] [Green Version]
  49. Li, X. Adaptively choosing neighbourhood bests using species in a particle swarm optimizer for multimodal function optimization. In Proceedings of the Genetic and Evolutionary Computation Conference 2004, Seattle, WA, USA, 26–30 June 2004; pp. 105–116. [Google Scholar] [CrossRef]
  50. Islam, J.; Rahaman, A.; Vasant, P.; Negash, B.; Hoqe, A.; Alhitmi, H.K.; Watada, J. A modified niching crow search approach to well placement optimization. Energies 2021, 14, 857. [Google Scholar] [CrossRef]
  51. Peram, T.; Veeramachaneni, K.; Mohan, C.K. Fitness-distance-ratio based particle swarm optimization. In Proceedings of the 2003 IEEE Swarm Intelligence Symposium. SIS’03, Indianapolis, IN, USA, 24–26 April 2003. [Google Scholar] [CrossRef]
  52. Bergh, F.V.D.; Engelbrecht, A.P. A study of particle swarm optimization particle trajectories. Inf. Sci. 2006, 176, 937–971. [Google Scholar] [CrossRef]
  53. Ray, T.; Saini, P. Engineering design optimization using a swarm with an intelligent information sharing among individuals. Eng. Optim. 2001, 33, 735–748. [Google Scholar] [CrossRef]
  54. Deb, K. Optimal design of a welded beam via genetic algorithms. AIAA J. 1991, 29, 2013–2015. [Google Scholar] [CrossRef]
  55. Ray, T.; Liew, K. Society and civilization: An optimization algorithm based on the simulation of social behavior. IEEE Trans. Evol. Comput. 2003, 7, 386–396. [Google Scholar] [CrossRef]
  56. Hedar, A.-R.; Fukushima, M. Derivative-free filter simulated annealing method for constrained continuous global optimization. J. Glob. Optim. 2006, 35, 521–549. [Google Scholar] [CrossRef]
  57. Wang, Y.; Cai, Z.; Zhou, Y. Accelerating adaptive trade-off model using shrinking space technique for constrained evolutionary optimization. Int. J. Numer. Methods Eng. 2009, 77, 1501–1534. [Google Scholar] [CrossRef]
  58. Wang, Y.; Cai, Z.; Zhou, Y.; Fan, Z. Constrained optimization based on hybrid evolutionary algorithm and adaptive constraint-handling technique. Struct. Multidiscip. Optim. 2008, 37, 395–413. [Google Scholar] [CrossRef]
  59. Coello, C.A.C.; Montes, E.M. Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv. Eng. Inform. 2002, 16, 193–203. [Google Scholar] [CrossRef]
  60. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intell. 2007, 20, 89–99. [Google Scholar] [CrossRef]
  61. Baykasoğlu, A.; Ozsoydan, F.B. Adaptive firefly algorithm with chaos for mechanical design optimization problems. Appl. Soft Comput. J. 2015, 36, 152–164. [Google Scholar] [CrossRef]
  62. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84. [Google Scholar] [CrossRef]
  63. Kumar, V.; Kumar, D. An astrophysics-inspired Grey Wolf algorithm for numerical optimization and its application to engineering design problems. Adv. Eng. Softw. 2017, 112, 231–254. [Google Scholar] [CrossRef]
  64. Zhang, M.; Luo, W.; Wang, X. Differential evolution with dynamic stochastic selection for constrained optimization. Inf. Sci. 2008, 178, 3043–3074. [Google Scholar] [CrossRef]
  65. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm—A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 2012, 110–111, 151–166. [Google Scholar] [CrossRef]
  66. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
  67. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-Verse Optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513. [Google Scholar] [CrossRef]
  68. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
Figure 1. Steps of the NGWO algorithm.
Figure 2. Convergence characteristics of the algorithms for the unimodal benchmark functions (F1–F7). Panels (a)–(g) correspond to F1–F7, respectively.
Figure 3. Convergence characteristics of the algorithms for the multi-modal benchmark functions (F8–F13). Panels (a)–(f) correspond to F8–F13, respectively.
Figure 4. Parameter space, search history, trajectory of the first wolf, average fitness of all wolves, and convergence curve of the NGWO for the unimodal functions (F1–F7). Panels (a)–(g) correspond to F1–F7, respectively.
Figure 5. Parameter space, search history, trajectory of the first wolf, average fitness of the wolves, and convergence curve of the NGWO for the multi-modal functions (F8–F13). Panels (a)–(f) correspond to F8–F13, respectively.
Figure 6. Box plots of the multi-modal benchmark functions (F8–F13). Panels (a)–(f) correspond to F8–F13, respectively.
Figure 7. Box plots of the fixed-dimensional multi-modal benchmark functions (F14–F23). Panels (a)–(j) correspond to F14–F23, respectively.
Table 1. Benchmark function properties.

| Function | Dimension | Range | Global Optima |
|---|---|---|---|
| F1 | 30 | [−100, 100] | 0 |
| F2 | 30 | [−10, 10] | 0 |
| F3 | 30 | [−100, 100] | 0 |
| F4 | 30 | [−100, 100] | 0 |
| F5 | 30 | [−30, 30] | 0 |
| F6 | 30 | [−100, 100] | 0 |
| F7 | 30 | [−1.28, 1.28] | 0 |
| F8 | 30 | [−500, 500] | −418 × D |
| F9 | 30 | [−5.12, 5.12] | 0 |
| F10 | 30 | [−32, 32] | 0 |
| F11 | 30 | [−600, 600] | 0 |
| F12 | 30 | [−50, 50] | 0 |
| F13 | 30 | [−50, 50] | 0 |
| F14 | 2 | [−65, 65] | 1 |
| F15 | 4 | [−5, 5] | 0.0003 |
| F16 | 2 | [−5, 5] | −1.0316 |
| F17 | 2 | [−5, 5] | 0.398 |
| F18 | 2 | [−2, 2] | 3 |
| F19 | 3 | [0, 1] | −3.86 |
| F20 | 6 | [0, 10] | −3.32 |
| F21 | 4 | [0, 10] | −10.1532 |
| F22 | 4 | [0, 10] | −10.4028 |
| F23 | 4 | [0, 10] | −10.5363 |
Table 2. Parameters of the compared algorithms.

| Algorithm | Parameters and Their Numerical Values |
|---|---|
| PSO | c1 = 2, c2 = 2 |
| GSA | c1 = 2, c2 = 2, G0 = 1 |
| GWO | a = 2 to 0 |
| JAYA | N/A |
| SOGWO | a = 2 to 0 |
| IGWO | a = 2 to 0 |
| NCSA | Awareness probability = 0.30, flight length = 0.20 |
| NGWO | a = 2 to 0, NC = 0.5 |
Table 3. Results of unimodal benchmarks (F1–F7).

| Function | Index | PSO (1998) | GSA (2011) | GWO (2014) | JAYA (2016) | SOGWO (2020) | IGWO (2021) | NCSA (2020) | NGWO (This Work) |
|---|---|---|---|---|---|---|---|---|---|
| F1 | Mean | 1.30 × 10−11 | 2.00 × 10−17 | 1.36 × 10−70 | 1.20 × 10−8 | 6.05 × 10−77 | 2.98 × 10−71 | 3.48 × 10−91 | 3.69 × 10−96 |
| | SD | 8.80 × 10−11 | 5.50 × 10−18 | 2.57 × 10−70 | 3.40 × 10−8 | 1.49 × 10−76 | 5.60 × 10−71 | 1.05 × 10−90 | 8.25 × 10−96 |
| | Min | 1.10 × 10−15 | 1.10 × 10−17 | 9.07 × 10−73 | 1.71 × 10−10 | 3.81 × 10−79 | 1.52 × 10−73 | N/A | 1.38 × 10−107 |
| F2 | Mean | 2.90 × 10−6 | 2.40 × 10−8 | 5.64 × 10−41 | 2.66 × 10−4 | 1.18 × 10−44 | 1.30 × 10−42 | 6.64 × 10−61 | 1.09 × 10−73 |
| | SD | 1.30 × 10−5 | 4.40 × 10−9 | 6.43 × 10−41 | 1.70 × 10−4 | 1.34 × 10−44 | 1.26 × 10−42 | 1.71 × 10−60 | 5.96 × 10−73 |
| | Min | 4.40 × 10−9 | 1.40 × 10−8 | 3.59 × 10−42 | 1.45 × 10−6 | 3.50 × 10−44 | 1.36 × 10−43 | N/A | 1.46 × 10−88 |
| F3 | Mean | 1.20 × 102 | 2.30 × 102 | 1.09 × 10−19 | 4.13 × 100 | 5.40 × 10−22 | 3.29 × 10−14 | 1.60 × 10−27 | 1.29 × 10−9 |
| | SD | 7.50 × 101 | 1.00 × 102 | 3.11 × 10−19 | 8.26 × 100 | 2.60 × 10−21 | 9.26 × 10−14 | 5.97 × 10−27 | 2.67 × 10−9 |
| | Min | 1.90 × 101 | 7.50 × 101 | 2.56 × 10−25 | 2.19 × 10−1 | 1.17 × 10−28 | 1.69 × 10−18 | N/A | 1.70 × 10−18 |
| F4 | Mean | 4.20 × 10−1 | 6.40 × 10−2 | 1.94 × 10−17 | 1.52 × 100 | 1.18 × 10−19 | 1.01 × 10−14 | 1.67 × 10−25 | 4.53 × 10−32 |
| | SD | 1.90 × 10−1 | 2.50 × 10−1 | 3.68 × 10−17 | 1.04 × 100 | 1.51 × 10−19 | 1.64 × 10−14 | 7.64 × 10−25 | 1.01 × 10−31 |
| | Min | 1.40 × 10−1 | 2.10 × 10−9 | 1.28 × 10−18 | 2.92 × 10−1 | 7.08 × 10−21 | 1.72 × 10−16 | N/A | 8.49 × 10−40 |
| F5 | Mean | 2.70 × 101 | 2.80 × 101 | 2.63 × 101 | 3.73 × 101 | 2.65 × 101 | 2.74 × 101 | 2.68 × 101 | 2.48 × 101 |
| | SD | 8.40 × 100 | 1.00 × 101 | 6.69 × 10−1 | 2.54 × 101 | 7.62 × 10−1 | 3.05 × 10−1 | 4.18 × 10−1 | 1.86 × 10−1 |
| | Min | 2.50 × 101 | 2.60 × 101 | 2.51 × 101 | 8.44 × 100 | 2.50 × 101 | 2.47 × 101 | N/A | 2.45 × 101 |
| F6 | Mean | 1.30 × 10−12 | 0.00 × 100 | 4.12 × 10−1 | 1.21 × 10−8 | 2.83 × 10−1 | 1.00 × 10−5 | 3.16 × 10−1 | 2.66 × 10−4 |
| | SD | 7.10 × 10−12 | 0.00 × 100 | 2.45 × 10−1 | 2.28 × 10−8 | 2.47 × 10−1 | 3.07 × 10−6 | 2.39 × 10−1 | 1.24 × 10−4 |
| | Min | 8.30 × 10−16 | 0.00 × 100 | 1.09 × 10−5 | 1.24 × 10−10 | 6.19 × 10−6 | 4.87 × 10−6 | N/A | 1.61 × 10−4 |
| F7 | Mean | 7.00 × 10−3 | 2.80 × 10−2 | 5.68 × 10−4 | 2.87 × 10−2 | 4.93 × 10−4 | 7.60 × 10−4 | 8.06 × 10−4 | 3.41 × 10−4 |
| | SD | 2.50 × 10−3 | 1.70 × 10−2 | 3.54 × 10−4 | 1.09 × 10−2 | 2.71 × 10−4 | 2.94 × 10−4 | 4.80 × 10−4 | 2.26 × 10−4 |
| | Min | 1.70 × 10−3 | 8.40 × 10−3 | 1.49 × 10−4 | 1.27 × 10−2 | 8.57 × 10−5 | 3.33 × 10−4 | N/A | 8.38 × 10−5 |
Table 4. Results of the compared algorithms for multi-modal benchmarks (F8–F23).

| Function | Index | PSO (1998) | GSA (2011) | GWO (2014) | JAYA (2016) | SOGWO (2020) | IGWO (2021) | NCSA (2020) | NGWO (This Work) |
|---|---|---|---|---|---|---|---|---|---|
| F8 | Mean | −9.00 × 103 | −2.70 × 103 | −6.07 × 103 | −7.66 × 103 | −6.57 × 103 | −9.53 × 103 | −7.25 × 103 | −1.25 × 104 |
| | SD | 5.20 × 102 | 4.70 × 102 | 5.37 × 102 | 1.01 × 103 | 8.03 × 102 | 1.40 × 103 | 9.86 × 102 | 9.77 × 101 |
| | Min | −1.00 × 104 | −4.20 × 103 | −7.09 × 103 | −9.66 × 103 | −8.18 × 103 | −1.13 × 104 | N/A | −1.26 × 104 |
| F9 | Mean | 4.10 × 101 | 1.70 × 101 | 5.20 × 100 | 2.68 × 101 | 0.00 × 100 | 1.42 × 101 | 0.00 × 100 | 0.00 × 100 |
| | SD | 1.50 × 101 | 4.30 × 100 | 1.89 × 100 | 9.89 × 100 | 0.00 × 100 | 5.80 × 100 | 0.00 × 100 | 0.00 × 100 |
| | Min | 1.80 × 101 | 9.00 × 100 | 0.00 × 100 | 1.49 × 101 | 0.00 × 100 | 2.99 × 100 | N/A | 0.00 × 100 |
| F10 | Mean | 9.10 × 10−8 | 3.40 × 10−9 | 1.31 × 10−14 | 2.63 × 100 | 8.88 × 10−16 | 9.41 × 10−15 | 4.44 × 10−8 | 4.44 × 10−15 |
| | SD | 2.00 × 10−7 | 4.10 × 10−10 | 2.73 × 10−15 | 9.95 × 10−1 | 0.00 × 100 | 2.74 × 10−15 | 0.00 × 100 | 0.00 × 100 |
| | Min | 4.60 × 10−9 | 2.20 × 10−9 | 7.99 × 10−15 | 9.31 × 10−1 | 8.88 × 10−16 | 7.99 × 10−15 | N/A | 4.44 × 10−15 |
| F11 | Mean | 1.20 × 10−2 | 4.30 × 100 | 5.23 × 10−4 | 1.99 × 10−2 | 0.00 × 100 | 9.55 × 10−3 | 0.00 × 100 | 0.00 × 100 |
| | SD | 1.20 × 10−2 | 1.60 × 100 | 2.61 × 10−3 | 2.97 × 10−2 | 0.00 × 100 | 3.60 × 10−3 | 0.00 × 100 | 0.00 × 100 |
| | Min | 5.10 × 10−15 | 2.00 × 100 | 0.00 × 100 | 8.43 × 10−10 | 0.00 × 100 | 0.00 × 100 | N/A | 0.00 × 100 |
| F12 | Mean | 1.50 × 10−13 | 2.50 × 10−2 | 2.66 × 10−2 | 1.58 × 10−1 | 5.61 × 10−2 | 7.50 × 10−7 | 1.66 × 10−2 | 2.10 × 10−5 |
| | SD | 3.60 × 10−13 | 6.10 × 10−2 | 1.55 × 10−2 | 2.26 × 10−1 | 1.42 × 10−2 | 1.81 × 10−7 | 8.75 × 10−3 | 3.62 × 10−6 |
| | Min | 1.60 × 10−13 | 6.20 × 10−2 | 6.57 × 10−2 | 4.84 × 10−11 | 2.62 × 10−2 | 4.45 × 10−7 | N/A | 1.69 × 10−5 |
| F13 | Mean | 2.00 × 10−31 | 2.10 × 10−18 | 3.25 × 10−1 | 1.97 × 100 | 3.53 × 10−1 | 1.42 × 10−5 | 4.08 × 10−1 | 1.12 × 10−2 |
| | SD | 4.30 × 10−31 | 5.00 × 10−19 | 1.56 × 10−1 | 3.71 × 100 | 1.28 × 10−1 | 4.31 × 10−6 | 2.27 × 10−1 | 1.07 × 10−2 |
| | Min | 9.90 × 10−31 | 1.22 × 10−18 | 1.58 × 10−5 | 6.58 × 10−9 | 1.42 × 10−5 | 7.42 × 10−6 | N/A | 3.42 × 10−4 |
| F14 | Mean | 1.00 × 100 | 3.80 × 100 | 3.11 × 100 | 9.98 × 10−1 | 3.43 × 100 | 9.98 × 10−1 | 1.13 × 100 | 9.98 × 10−1 |
| | SD | 3.20 × 10−17 | 2.60 × 100 | 3.73 × 100 | 0.00 × 100 | 3.72 × 100 | 5.83 × 10−17 | 5.03 × 10−1 | 1.11 × 10−16 |
| | Min | 1.00 × 100 | 1.00 × 100 | 9.98 × 10−1 | 9.98 × 10−1 | 9.98 × 10−1 | 9.98 × 10−1 | N/A | 9.98 × 10−1 |
| F15 | Mean | 1.20 × 10−3 | 4.10 × 10−3 | 4.36 × 10−3 | 4.30 × 10−4 | 2.38 × 10−3 | 3.07 × 10−4 | 3.75 × 10−4 | 3.08 × 10−4 |
| | SD | 4.00 × 10−3 | 3.20 × 10−3 | 8.17 × 10−3 | 3.17 × 10−4 | 6.03 × 10−3 | 6.57 × 10−10 | 7.73 × 10−5 | 4.22 × 10−8 |
| | Min | 3.10 × 10−3 | 1.40 × 10−3 | 3.07 × 10−4 | 3.07 × 10−4 | 3.07 × 10−4 | 3.07 × 10−4 | N/A | 3.07 × 10−4 |
| F16 | Mean | −1.00 × 100 | −1.00 × 100 | −1.02 × 100 | −1.03 × 100 | −1.03 × 100 | −1.03 × 100 | −1.03 × 100 | −1.03 × 100 |
| | SD | 2.30 × 10−16 | 4.00 × 10−16 | 4.80 × 10−9 | 6.78 × 10−6 | 3.75 × 10−9 | 6.78 × 10−6 | 2.39 × 10−6 | 0.00 × 100 |
| | Min | −1.00 × 100 | −1.00 × 100 | −1.03 × 100 | −1.03 × 100 | −1.03 × 100 | −1.03 × 100 | N/A | −1.03 × 100 |
| F17 | Mean | 4.00 × 10−1 | 4.00 × 10−1 | 3.98 × 10−1 | 3.98 × 10−1 | 3.97 × 10−1 | 3.98 × 10−1 | 3.98 × 10−1 | 3.98 × 10−1 |
| | SD | 3.40 × 10−16 | 3.40 × 10−16 | 3.36 × 10−7 | 0.00 × 100 | 4.86 × 10−7 | 0.00 × 100 | 1.09 × 10−5 | 0.00 × 100 |
| | Min | 4.00 × 10−1 | 4.00 × 10−1 | 3.98 × 10−1 | 3.98 × 10−1 | 3.98 × 10−1 | 3.98 × 10−1 | N/A | 3.98 × 10−1 |
| F18 | Mean | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 |
| | SD | 3.10 × 10−15 | 2.20 × 10−15 | 4.88 × 10−6 | 1.51 × 10−15 | 4.63 × 10−6 | 1.24 × 10−14 | 9.98 × 10−6 | 1.49 × 10−15 |
| | Min | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | 3.00 × 100 | N/A | 3.00 × 100 |
| F19 | Mean | −3.90 × 100 | −3.60 × 100 | −3.86 × 100 | −3.86 × 100 | −3.86 × 100 | −3.86 × 100 | −3.86 × 100 | −3.86 × 100 |
| | SD | 3.10 × 10−15 | 3.00 × 10−1 | 1.05 × 10−3 | 2.71 × 10−15 | 2.71 × 10−3 | 2.71 × 10−15 | 2.67 × 10−4 | 1.51 × 10−15 |
| | Min | −3.90 × 100 | −3.90 × 100 | −3.86 × 100 | −3.86 × 100 | −3.86 × 100 | −3.86 × 100 | N/A | −3.86 × 100 |
| F20 | Mean | −3.30 × 100 | −1.90 × 100 | −3.25 × 100 | −3.29 × 100 | −3.27 × 100 | −3.32 × 100 | −3.29 × 100 | −3.32 × 100 |
| | SD | 5.50 × 10−2 | 5.40 × 10−1 | 7.04 × 10−2 | 5.11 × 10−2 | 7.37 × 10−2 | 3.63 × 10−2 | 4.45 × 10−2 | 3.83 × 10−7 |
| | Min | −3.30 × 100 | −3.30 × 100 | −3.32 × 100 | −3.32 × 100 | −3.32 × 100 | −3.32 × 100 | N/A | −3.32 × 100 |
| F21 | Mean | −7.20 × 100 | −5.10 × 100 | −9.95 × 100 | −1.02 × 101 | −9.66 × 100 | −1.02 × 101 | −7.73 × 100 | −1.02 × 101 |
| | SD | 3.30 × 100 | 7.40 × 10−3 | 1.01 × 100 | 7.23 × 10−15 | 1.51 × 100 | 3.12 × 10−8 | 2.19 × 100 | 1.55 × 10−3 |
| | Min | −1.00 × 101 | −5.10 × 100 | −1.02 × 101 | −1.02 × 101 | −1.02 × 101 | −1.02 × 101 | N/A | −1.02 × 101 |
| F22 | Mean | −9.10 × 100 | −7.50 × 100 | −1.02 × 101 | −1.04 × 101 | −1.04 × 101 | −1.04 × 101 | −8.82 × 100 | −1.04 × 101 |
| | SD | 2.80 × 100 | 2.70 × 100 | 1.05 × 100 | 9.33 × 10−16 | 2.66 × 10−4 | 3.12 × 10−8 | 1.90 × 100 | 4.40 × 10−2 |
| | Min | −1.00 × 101 | −1.00 × 101 | −1.04 × 101 | −1.04 × 101 | −1.04 × 101 | −1.04 × 101 | N/A | −1.04 × 101 |
| F23 | Mean | −9.40 × 100 | −1.00 × 101 | −1.03 × 101 | −1.05 × 101 | −1.05 × 101 | −1.04 × 101 | −9.16 × 100 | −1.05 × 101 |
| | SD | 2.80 × 100 | 7.80 × 10−1 | 1.08 × 100 | 1.81 × 10−15 | 5.41 × 10−1 | 7.40 × 10−8 | 1.86 × 100 | 6.64 × 10−10 |
| | Min | −1.10 × 101 | −1.10 × 101 | −1.05 × 101 | −1.05 × 101 | −1.05 × 101 | −1.04 × 101 | N/A | −1.05 × 101 |
Table 5. Ranking of the algorithms based on the Friedman test.

| Function | PSO (1998) | GSA (2011) | GWO (2014) | JAYA (2016) | SOGWO (2020) | IGWO (2021) | NCSA (2020) | NGWO (This Work) | p-Value |
|---|---|---|---|---|---|---|---|---|---|
| F1 | 7.000 | 5.967 | 4.433 | 8.000 | 4.067 | 3.400 | 1.833 | 1.267 | 4.902 × 10−39 |
| F2 | 7.060 | 6.000 | 4.667 | 6.877 | 4.267 | 2.967 | 2.000 | 1.000 | 8.466 × 10−41 |
| F3 | 6.830 | 8.000 | 2.067 | 6.100 | 2.567 | 4.033 | 1.400 | 4.933 | 8.027 × 10−40 |
| F4 | 7.000 | 6.033 | 3.133 | 7.967 | 3.500 | 4.967 | 2.400 | 1.000 | 4.720 × 10−40 |
| F5 | 5.400 | 4.700 | 5.533 | 4.367 | 5.733 | 2.400 | 4.767 | 1.133 | 1.633 × 10−17 |
| F6 | 2.000 | 1.000 | 6.167 | 3.000 | 6.300 | 4.133 | 8.000 | 5.400 | 1.259 × 10−39 |
| F7 | 7.900 | 6.300 | 3.400 | 6.800 | 3.367 | 4.300 | 2.400 | 1.533 | 2.906 × 10−35 |
| F8 | 5.133 | 8.000 | 6.333 | 4.267 | 5.600 | 2.667 | 3.000 | 1.000 | 1.584 × 10−34 |
| F9 | 7.767 | 5.733 | 2.633 | 6.933 | 2.500 | 5.567 | 2.433 | 2.433 | 2.922 × 10−39 |
| F10 | 7.000 | 6.000 | 4.217 | 8.000 | 4.417 | 3.267 | 1.650 | 1.450 | 1.618 × 10−40 |
| F11 | 6.000 | 8.000 | 3.133 | 6.333 | 3.383 | 3.617 | 2.733 | 2.700 | 3.735 × 10−33 |
| F12 | 1.767 | 2.333 | 5.667 | 5.567 | 5.867 | 3.033 | 7.567 | 3.433 | 3.310 × 10−27 |
| F13 | 2.367 | 1.300 | 5.900 | 4.867 | 6.267 | 3.033 | 7.767 | 3.533 | 4.302 × 10−32 |
| F14 | 4.617 | 6.733 | 6.067 | 2.817 | 5.133 | 2.817 | 5.000 | 2.817 | 3.877 × 10−20 |
| F15 | 6.500 | 7.700 | 4.817 | 2.333 | 4.983 | 2.817 | 5.213 | 1.517 | 1.506 × 10−31 |
| F16 | 3.767 | 3.767 | 6.783 | 3.767 | 6.617 | 3.767 | 3.767 | 3.767 | 7.581 × 10−28 |
| F17 | 3.517 | 3.517 | 7.567 | 3.517 | 7.333 | 3.517 | 3.517 | 3.517 | 4.257 × 10−40 |
| F18 | 3.500 | 3.500 | 7.533 | 3.500 | 7.467 | 3.500 | 3.500 | 3.500 | 7.003 × 10−40 |
| F19 | 3.500 | 3.500 | 7.533 | 3.500 | 7.467 | 3.500 | 3.500 | 3.500 | 7.003 × 10−41 |
| F20 | 4.750 | 2.550 | 6.700 | 3.583 | 6.900 | 3.433 | 3.400 | 2.683 | 7.641 × 10−23 |
| F21 | 4.133 | 5.233 | 5.967 | 2.500 | 6.067 | 2.767 | 5.250 | 4.083 | 3.014 × 10−15 |
| F22 | 3.483 | 2.983 | 6.867 | 2.983 | 6.967 | 3.050 | 4.750 | 4.917 | 3.309 × 10−25 |
| F23 | 4.650 | 1.500 | 6.633 | 1.500 | 6.867 | 4.033 | 5.500 | 5.317 | 2.890 × 10−31 |
Table 6. Comparative study of the welded beam design problem.

| Algorithm | y1 | y2 | y3 | y4 | f (Best) | f (Mean) | f (Worst) | SD | NFEs |
|---|---|---|---|---|---|---|---|---|---|
| NGWO | 0.34094 | 3.5810 | 9.0321 | 0.2063 | 2.036 | 2.046 | 2.0632 | 1.02 × 10−2 | 20,100 |
| GA [54] | 0.2489 | 6.173 | 8.1789 | 0.2533 | 2.4331 | N/A | N/A | N/A | N/A |
| SC [55] | 0.2444 | 6.238 | 8.2886 | 0.2446 | 2.3854 | 3.2551 | 6.3997 | 9.60 × 10−1 | 33,095 |
| FSA [56] | 0.2443 | 6.2158 | 8.2939 | 0.2443 | 2.3811 | 2.4042 | 2.489 | N/A | 56,243 |
| AATM [57] | 0.2441 | 6.2209 | 8.2982 | 0.2444 | 2.3823 | 2.387 | 2.3916 | 2.20 × 10−3 | 30,000 |
| HEAA [58] | 0.2444 | 6.2175 | 8.2915 | 0.2444 | 2.381 | 2.381 | 2.381 | 1.30 × 10−5 | 30,000 |
| EEGWO [41] | 0.2444 | 6.217 | 8.2928 | 0.2444 | 2.3813 | 2.3817 | 2.3824 | 4.18 × 10−4 | 50,000 |
Table 7. Comparative study of the tension–compression coil spring design.

| Algorithm | y1 | y2 | y3 | f (Best) | f (Mean) | f (Worst) | SD | NFEs |
|---|---|---|---|---|---|---|---|---|
| NGWO | 10.32826 | 0.35486 | 0.05026 | 0.011053 | 0.011755 | 0.012454 | 5.21 × 10−4 | 20,100 |
| GA [59] | 10.89052 | 0.36396 | 0.05198 | 0.01268 | 0.01274 | 0.01297 | 5.90 × 10−5 | 80,000 |
| FSA [56] | 11.21390 | 0.35800 | 0.05174 | 0.01266 | 0.01266 | 0.01266 | 2.20 × 10−8 | 49,531 |
| CPSO [60] | 11.24454 | 0.35764 | 0.05172 | 0.01267 | 0.01273 | 0.01292 | 5.20 × 10−5 | 200,000 |
| HPSO [60] | 11.26503 | 0.35712 | 0.05170 | 0.01266 | 0.01270 | 0.01271 | 1.58 × 10−5 | 81,000 |
| GSA [13] | 14.22867 | 0.05 | 0.31731 | 0.01287 | 0.01343 | 0.01421 | 1.34 × 10−2 | 30,000 |
| GWO [18] | 12.04249 | 0.34454 | 0.05117 | 0.01267 | 0.01269 | 0.01272 | 2.10 × 10−5 | 30,000 |
| AFA [61] | 11.31956 | 0.35619 | 0.05166 | 0.01266 | 0.01266 | 0.01280 | 1.27 × 10−2 | 50,000 |
| TEO [62] | 11.16839 | 0.35879 | 0.05177 | 0.01266 | 0.01268 | 0.01271 | 4.41 × 10−2 | 300,000 |
| MGWO [63] | 11.80809 | 0.34819 | 0.05133 | 0.01266 | 0.01267 | 0.01270 | 1.10 × 10−5 | 30,000 |
| EEGWO [41] | 11.3113 | 0.35634 | 0.05167 | 0.01266 | 0.01268 | 0.01272 | 2.22 × 10−5 | 50,000 |
Table 8. Results of the three-bar truss design problem.

| Algorithm | y1 | y2 | f (Best) | f (Mean) | f (Worst) | SD | NFEs |
|---|---|---|---|---|---|---|---|
| NGWO | 0.789186 | 0.406806 | 263.8959 | 263.8964 | 263.8971 | 5.58 × 10−4 | 20,100 |
| PSO [6] | 0.781224 | 0.432548 | 264.2183 | 265.9553 | 267.459 | 1.38 × 100 | 30,000 |
| SC [55] | 0.788621 | 0.408401 | 263.8958 | 263.9033 | 263.9697 | 1.30 × 10−2 | 17,610 |
| DEDS [64] | 0.788651 | 0.408316 | 263.8958 | 263.8958 | 263.8958 | 9.70 × 10−7 | 15,000 |
| GSA [13] | 0.777662 | 0.448853 | 264.8299 | 271.0348 | 279.7925 | 4.12 × 100 | 30,000 |
| HEAA [58] | 0.78868 | 0.408234 | 263.8958 | 263.8959 | 263.8961 | 4.90 × 10−5 | 15,000 |
| AATM [57] | 0.788682 | 0.408229 | 263.8958 | 263.8966 | 263.9004 | 1.10 × 10−3 | 17,000 |
| WCA [65] | 0.788651 | 0.408316 | 263.8958 | 263.8959 | 263.8962 | 8.71 × 10−5 | 5250 |
| MBA [66] | 0.788565 | 0.40856 | 263.8959 | 263.898 | 263.916 | 3.93 × 10−3 | 13,280 |
| GWO [18] | 0.788409 | 0.409003 | 263.8959 | 263.8966 | 263.898 | 4.37 × 10−4 | 30,000 |
| MVO [67] | 0.788993 | 0.407351 | 263.8959 | 263.8961 | 263.8971 | 2.49 × 10−4 | 30,000 |
| SCA [68] | 0.789068 | 0.407162 | 263.8984 | 263.9356 | 263.9951 | 2.88 × 10−2 | 30,000 |
| MGWO [63] | 0.788561 | 0.408572 | 263.8959 | 263.8963 | 263.8976 | 4.29 × 10−4 | 30,000 |
| EEGWO [41] | 0.78841 | 0.40899 | 263.896 | 263.8963 | 263.8966 | 2.19 × 10−4 | 50,000 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
