Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems

A simple yet powerful optimization algorithm is proposed in this paper for solving constrained and unconstrained optimization problems. The algorithm is based on the concept that the solution obtained for a given problem should move towards the best solution and away from the worst solution. It requires only the common control parameters and does not require any algorithm-specific control parameters. The performance of the proposed algorithm is investigated by implementing it on 24 constrained benchmark functions with different characteristics given in the Congress on Evolutionary Computation (CEC 2006), and the performance is compared with that of other well-known optimization algorithms. The results demonstrate the effectiveness of the proposed algorithm. Furthermore, statistical analysis of the experimental work is carried out using the Friedman rank test and the Holm-Sidak test. The proposed algorithm secures the first rank for the 'best' and 'mean' solutions in the Friedman rank test for all 24 constrained benchmark problems. In addition to the constrained benchmark problems, the algorithm is also investigated on 30 unconstrained benchmark problems taken from the literature, where its performance is likewise found to be better.


Introduction
Population-based heuristic algorithms fall into two important groups: evolutionary algorithms (EA) and swarm intelligence (SI) based algorithms. Some of the recognized evolutionary algorithms are: Genetic Algorithm (GA), Evolution Strategy (ES), Evolutionary Programming (EP), Differential Evolution (DE), Bacteria Foraging Optimization (BFO), Artificial Immune Algorithm (AIA), etc. Some of the well-known swarm intelligence based algorithms are: Particle Swarm Optimization (PSO), Shuffled Frog Leaping (SFL), Ant Colony Optimization (ACO), Artificial Bee Colony (ABC), Firefly (FF) algorithm, etc. Besides the evolutionary and swarm intelligence based algorithms, there are other algorithms which work on the principles of different natural phenomena. Some of them are: Harmony Search (HS) algorithm, Gravitational Search Algorithm (GSA), Biogeography-Based Optimization (BBO), Grenade Explosion Method (GEM), etc. (Rao and Patel, 2012, 2013).
All the evolutionary and swarm intelligence based algorithms are probabilistic and require common control parameters such as population size, number of generations, elite size, etc. Besides the common control parameters, different algorithms require their own algorithm-specific control parameters. For example, GA uses mutation probability, crossover probability and a selection operator; PSO uses inertia weight and social and cognitive parameters; ABC uses the numbers of onlooker bees, employed bees and scout bees, and the limit; the HS algorithm uses the harmony memory consideration rate, pitch adjusting rate and number of improvisations. Similarly, other algorithms such as ES, EP, DE, BFO, AIA, SFL, ACO, etc. need tuning of their respective algorithm-specific parameters. Proper tuning of the algorithm-specific parameters is a crucial factor affecting the performance of the above-mentioned algorithms; improper tuning either increases the computational effort or yields a local optimal solution. Considering this fact, Rao et al. (2011) introduced the teaching-learning-based optimization (TLBO) algorithm, which does not require any algorithm-specific parameters. The TLBO algorithm requires only common control parameters like population size and number of generations for its working, and it has gained wide acceptance among optimization researchers (Rao, 2015).
In view of the success of the TLBO algorithm, another algorithm-specific parameter-less algorithm is proposed in this paper. However, unlike the two phases (i.e. the teacher phase and the learner phase) of the TLBO algorithm, the proposed algorithm has only one phase and is comparatively simpler to apply. The working of the proposed algorithm is much different from that of the TLBO algorithm. The next section describes the proposed algorithm.

Proposed algorithm
Let f(x) be the objective function to be minimized (or maximized). At any iteration i, assume that there are 'm' design variables (i.e. j = 1, 2, …, m) and 'n' candidate solutions (i.e. population size, k = 1, 2, …, n). Let the best candidate obtain the best value of f(x) (i.e. f(x)best) among all candidate solutions, and let the worst candidate obtain the worst value of f(x) (i.e. f(x)worst). If Xj,k,i is the value of the j-th variable for the k-th candidate during the i-th iteration, then this value is modified as per the following Eq. (1).
X'j,k,i = Xj,k,i + r1,j,i (Xj,best,i − |Xj,k,i|) − r2,j,i (Xj,worst,i − |Xj,k,i|)   (1)

where Xj,best,i is the value of variable j for the best candidate, Xj,worst,i is the value of variable j for the worst candidate, X'j,k,i is the updated value of Xj,k,i, and r1,j,i and r2,j,i are two random numbers for the j-th variable during the i-th iteration in the range [0, 1]. The term "r1,j,i (Xj,best,i − |Xj,k,i|)" indicates the tendency of the solution to move closer to the best solution, and the term "−r2,j,i (Xj,worst,i − |Xj,k,i|)" indicates the tendency of the solution to avoid the worst solution. X'j,k,i is accepted if it gives a better function value. All the accepted function values at the end of an iteration are maintained and become the input to the next iteration.
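As a concrete illustration, Eq. (1) can be sketched in Python as follows. This is a minimal sketch, not the author's code: the function name `jaya_update`, the NumPy array layout (one row per candidate) and the per-candidate greedy acceptance loop are assumptions made for this example.

```python
import numpy as np

def jaya_update(X, f):
    """One Jaya iteration on a population X of shape (n, m):
    each candidate moves towards the best solution and away from
    the worst (Eq. (1)); a trial vector is accepted only if it
    improves the objective value (minimization assumed)."""
    n, m = X.shape
    fitness = np.array([f(x) for x in X])
    best = X[np.argmin(fitness)]       # candidate with the best f(x)
    worst = X[np.argmax(fitness)]      # candidate with the worst f(x)
    X_new = X.copy()
    for k in range(n):
        r1 = np.random.rand(m)         # r1,j,i in [0, 1]
        r2 = np.random.rand(m)         # r2,j,i in [0, 1]
        trial = (X[k]
                 + r1 * (best - np.abs(X[k]))     # move towards the best
                 - r2 * (worst - np.abs(X[k])))   # move away from the worst
        if f(trial) < fitness[k]:      # keep only improving solutions
            X_new[k] = trial
    return X_new
```

Repeating `jaya_update` until the termination criterion is met gives the whole algorithm; note that no algorithm-specific parameter appears anywhere in the loop.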
Fig. 1 shows the flowchart of the proposed algorithm. The algorithm always tries to get closer to success (i.e. reaching the best solution) and tries to avoid failure (i.e. moving away from the worst solution). The algorithm strives to become victorious by reaching the best solution and hence is named Jaya (a Sanskrit word meaning victory). The proposed method is illustrated by means of an unconstrained benchmark function, the Sphere function, in the next section.

Demonstration of the working of Jaya algorithm
To demonstrate the working of the Jaya algorithm, the Sphere unconstrained benchmark function is considered. The objective is to find the values of xi that minimize the value of the Sphere function.
min f(x) = Σ xi², i = 1, 2, …, m

The known minimum of this benchmark function is 0, attained when all xi are 0. To demonstrate the Jaya algorithm, let us assume a population size of 5 (i.e. five candidate solutions), two design variables x1 and x2, and two iterations as the termination criterion. The initial population is randomly generated within the ranges of the variables, and the corresponding values of the objective function are shown in Table 1. As this is a minimization problem, the lowest value of f(x) is considered the best solution and the highest value the worst solution. From Table 1 it can be seen that the best solution corresponds to the 4th candidate and the worst solution corresponds to the 3rd candidate. Now, assuming random numbers r1 = 0.58 and r2 = 0.81 for x1, and r1 = 0.92 and r2 = 0.49 for x2, the new values of x1 and x2 are calculated using Eq. (1) and are placed in Table 2. For example, for the 1st candidate, the new value of x2 during the first iteration is calculated as shown below.
x2' = 18 + 0.92 (7 − |18|) − 0.49 (−6 − |18|) = 19.64

Similarly, the new values of x1 and x2 for the other candidates are calculated. Now, the values of f(x) in Tables 1 and 2 are compared, the better values of f(x) are retained and placed in Table 3, and this completes the first iteration of the Jaya algorithm. From Table 3 it can be seen that the best solution corresponds to the 4th candidate and the worst solution corresponds to the 2nd candidate. During the second iteration, assuming random numbers r1 = 0.27 and r2 = 0.23 for x1, and r1 = 0.38 and r2 = 0.51 for x2, the new values of x1 and x2 are calculated using Eq. (1). The values of f(x) in Tables 3 and 4 are then compared, the better values of f(x) are retained and placed in Table 5, and this completes the second iteration of the Jaya algorithm. From Table 5 it can be seen that the best solution corresponds to the 1st candidate and the worst solution corresponds to the 2nd candidate. It can also be observed that the value of the objective function is reduced from 113 to 7.7803 in just two iterations; with more iterations, the known optimum value of the objective function (i.e. 0) can be reached within a few further iterations. It is also to be noted that in the case of maximization problems, the best value means the maximum value of the objective function and the calculations proceed accordingly. Thus, the proposed method can deal with both minimization and maximization problems.
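The hand calculation above can be checked directly against Eq. (1). The snippet below is a check of the worked example, not part of the algorithm itself; the values are taken from the demonstration.

```python
# Values from the worked example: x2 = 18 for the 1st candidate,
# x2 = 7 for the best (4th) candidate and x2 = -6 for the worst (3rd)
# candidate, with random numbers r1 = 0.92 and r2 = 0.49 for x2.
x, best, worst = 18.0, 7.0, -6.0
r1, r2 = 0.92, 0.49
x_new = x + r1 * (best - abs(x)) - r2 * (worst - abs(x))
print(round(x_new, 2))  # -> 19.64, matching the hand calculation
```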
The above demonstration is for an unconstrained optimization problem. Similar steps can be followed in the case of a constrained optimization problem; the main difference is that a penalty function is used to take care of the violation of each constraint, and the penalty is applied to the objective function. The next section deals with the experimentation of the proposed algorithm on 24 constrained benchmark problems given in CEC 2006 (Liang et al., 2006; Karaboga & Basturk, 2007; Karaboga & Akay, 2011).
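The penalty-based constraint handling mentioned above can be sketched as follows. This is an assumed simple form of the static penalty method: the paper does not give the penalty coefficient, so the value `penalty=1e6`, the helper name `penalized` and the `g(x) <= 0` convention for inequality constraints are assumptions made for this illustration.

```python
def penalized(f, constraints, penalty=1e6):
    """Wrap an objective f (to be minimized) with a static penalty:
    each inequality constraint g(x) <= 0 that is violated adds
    penalty * max(0, g(x)) to the objective value."""
    def fp(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + penalty * violation
    return fp
```

The penalized objective can then be minimized with the Jaya update exactly as in the unconstrained case, since a constraint violation simply makes a candidate look worse.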

Experiments on constrained benchmark problems
The objectives and constraints of the considered 24 benchmark functions of CEC 2006 have different characteristics such as linear, nonlinear, quadratic, cubic and polynomial. The number of design variables and their ranges are different for each problem (Liang et al., 2006; Karaboga and Basturk, 2007; Karaboga and Akay, 2011).
To evaluate the performance of the proposed Jaya algorithm, the results obtained by the Jaya algorithm are compared with those obtained by other optimization algorithms available in the literature, such as homomorphous mapping (HM), adaptive segregational constraint handling evolutionary algorithm (ASCHEA), simple multi-membered evolution strategy (SMES), genetic algorithm (GA), particle swarm optimization (PSO), differential evolution (DE), artificial bee colony (ABC) and biogeography based optimization (BBO) (Karaboga & Basturk, 2007; Karaboga & Akay, 2011). However, these algorithms were experimented on only 13 functions of CEC 2006. Patel and Savsani (2015) extended the application of PSO, BBO, DE and ABC to the remaining 11 functions and, in addition, applied the TLBO and heat transfer search (HTS) algorithms to all 24 constrained benchmark functions. Computational experiments are now conducted to identify the performance of the proposed Jaya algorithm in comparison with the above-mentioned algorithms. In general, an algorithm which requires fewer function evaluations to reach the same best solution can be considered better than the others. However, to maintain consistency in the comparison of the competitive algorithms, a common experimental platform described by Patel and Savsani (2015) is adopted: the maximum number of function evaluations is set to 240000 for each benchmark function, and the static penalty method is used as the constraint handling technique. Just like the other algorithms, the proposed Jaya algorithm is executed 100 times for each benchmark function and the mean results obtained are compared with those of the other algorithms for the same number of runs.
For the TLBO algorithm, Patel and Savsani (2015) used a population size of 50 for all 24 benchmark functions and thus reported inferior values for the TLBO algorithm for many benchmark functions, even though the function evaluations were kept at 240000 in each case. It may be noted that population size and number of generations are common control parameters (i.e. not algorithm-specific parameters), and variation in the common control parameters may also affect the results. Patel and Savsani (2015) had not considered different combinations of population sizes and numbers of generations for the TLBO algorithm while maintaining the number of function evaluations at 240000. Hence, the results of the TLBO algorithm are corrected here and included in Table 6. The corrected results for the TLBO algorithm presented in this paper correspond to population sizes of 100, 70, 100, 40, 60, 100, 50, 30, 100, 70, 50, 60, 80, 80, 50, 100, 70, 100, 50, 100, 100, 50, 70 and 80 respectively for functions G01-G24, with the number of generations decided accordingly so that the number of function evaluations remains 240000 in each case. It can be seen that the corrected results of the TLBO algorithm are better than those reported by Patel and Savsani (2015). Employing a population size of 50 for all the algorithms for all benchmark functions, as done by Patel and Savsani (2015), might not have revealed the true potential of the algorithms. It is important to keep the same number of function evaluations for all the algorithms for a benchmark function (instead of the same population size for all the algorithms), and this procedure can be applied to other benchmark functions also.

Results and discussion on constrained benchmark functions
The results of the proposed Jaya algorithm for each benchmark function, obtained by employing appropriate population sizes while maintaining 240000 function evaluations, are included in the last column of Table 6. Table 6 shows the comparative results of benchmark functions G01-G13 obtained by different algorithms for 240000 function evaluations averaged over 100 runs. The 'best (B)', 'worst (W)' and 'mean (M)' values of the 13 constrained benchmark functions (i.e. G01-G13) attempted by the HM, ASCHEA, SMES, GA, PSO, DE, ABC, BBO, HTS, TLBO and Jaya algorithms are shown, and the expected global optimum values are given within brackets under each function. It can be observed that the proposed Jaya algorithm has obtained the global optimum values for all the benchmark functions except G10; however, even in this case, the best value obtained by the Jaya algorithm is superior to those of the remaining algorithms. Furthermore, it can be observed that the 'mean (M)' values of all 13 benchmark functions obtained by the Jaya algorithm are better than those of all the other algorithms.
Table 7 shows the comparative results of benchmark functions G14-G24 obtained by different algorithms for 240000 function evaluations averaged over 100 runs. Here also it can be observed that the proposed Jaya algorithm has obtained the global optimum values for all the benchmark functions except G14, G19, G20, G21, G22 and G23; however, even in these cases, the best value obtained by the Jaya algorithm for each of these functions is superior to those given by the remaining algorithms. Furthermore, the 'mean (M)' values of all 11 benchmark functions obtained by the Jaya algorithm are better than those of all the other algorithms. It may be mentioned here that the HTS algorithm employed elitism while all the other algorithms did not; a fair comparison would involve only non-elitist algorithms. However, as only the results of the elitist HTS are available in Patel and Savsani (2015), the comparison is made with the same. Even then, the Jaya algorithm has performed better than the other algorithms, including the elitist HTS algorithm, in the case of benchmark functions G14-G24. It may also be mentioned that the elitist HTS algorithm is not an algorithm-specific parameter-less algorithm (it contains conduction, convection and radiation factors, and the results vary with the values of these factors).
Table 8 shows the success rates of various algorithms for functions G01-G24 over 100 runs. The comparison is made between the PSO, BBO, DE, ABC, HTS and TLBO algorithms, as these algorithms were applied to all 24 benchmark functions. The success rate obtained by all the algorithms is 0 in the case of 8 benchmark functions (i.e. G02, G10, G13, G14, G19, G20, G22 and G23). In the case of the remaining 16 benchmark functions, the success rate obtained by the proposed Jaya algorithm is either equal or superior to those of the PSO, BBO, DE, ABC, HTS and TLBO algorithms. The bold values indicate the best results.
Table 9 shows the mean number of function evaluations required to reach the global optimum value by the competitive algorithms for benchmark functions G01-G24 over 100 independent runs (except functions G02, G10, G13, G14, G19, G20, G22 and G23, for which the data are not available in the literature). Here also it can be observed that the proposed Jaya algorithm has obtained better results (i.e. the minimum mean number of function evaluations required to reach the global optimum value) for all the benchmark functions except G01 and G12, as compared to the PSO, BBO, DE, ABC, HTS and TLBO algorithms. The standard deviations of the function evaluations for the Jaya algorithm are also comparatively better in the case of many functions.

Statistical tests
It is observed from the results presented in Tables 6-9 that the performance of the proposed Jaya algorithm is better than that of the other competitive algorithms. However, it is necessary to conduct statistical tests such as the Friedman rank test (Joaquin et al., 2011) and the Holm-Sidak test (Holm, 1979) to prove the significance of the proposed algorithm. Table 10 shows the Friedman rank test for the 'best' and 'mean' solutions obtained for functions G01-G13, and Table 11 shows the Friedman rank test for the 'best' and 'mean' solutions obtained for functions G14-G24. The G05, G12 and G13 functions were omitted by the previous researchers as the results of these functions are not available for some of the competitive algorithms, and the G22 function was omitted as none of the competitive algorithms had produced a feasible solution for it; hence the same approach is used in the present work. It can easily be observed from Tables 10 and 11 that the proposed Jaya algorithm has obtained the 1st rank for the 'best' and 'mean' solutions of all the benchmark functions considered, and the corrected TLBO algorithm has obtained the 2nd rank. Table 12 shows the results of the Friedman rank test for the 'success rate' values obtained; the proposed Jaya algorithm has obtained the 1st rank again, followed by TLBO.
The Friedman rank test ranks the algorithms based on the results they obtain. However, this test does not establish any statistical difference in the results, and hence the Holm-Sidak test is used to determine the statistical difference between the algorithms. Table 13 shows the Holm-Sidak test for the 'best' and 'mean' solutions obtained for functions G01-G13, and Table 14 shows the corresponding values for functions G14-G24. The pairwise p-values obtained from the Holm-Sidak test show the statistical difference between the proposed Jaya algorithm and the other algorithms. The statistical difference between the proposed Jaya algorithm and the TLBO algorithm is smaller.
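For reference, the average ranking behind Tables 10-12 can be sketched as follows. This is an illustration with a hypothetical results array, not the paper's data; tie handling is omitted for brevity, and in practice the accompanying test statistic can be computed with `scipy.stats.friedmanchisquare`.

```python
import numpy as np

def friedman_ranks(results):
    """Average Friedman ranks for an (n_problems, n_algorithms) array
    of objective values (lower is better): each row is ranked
    1..n_algorithms and the ranks are averaged per algorithm.
    Ties are not handled in this sketch."""
    order = np.argsort(results, axis=1)            # per-problem ordering
    ranks = np.empty_like(order)
    rows = np.arange(results.shape[0])[:, None]
    ranks[rows, order] = np.arange(1, results.shape[1] + 1)
    return ranks.mean(axis=0)
```

An algorithm that is best on every problem receives an average rank of 1.0, matching the 1st rank reported for the Jaya algorithm.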

Table 16
Results obtained by the Jaya algorithm for 30 benchmark functions over 30 independent runs with 500000 function evaluations

Experiments on unconstrained benchmark problems
The performance of the proposed Jaya algorithm is tested further on 30 unconstrained benchmark functions well documented in the optimization literature. These unconstrained functions have different characteristics such as unimodality/multimodality, separability/non-separability, regularity/non-regularity, etc. The number of design variables and their ranges are different for each problem. Table 15 shows the 30 unconstrained benchmark functions. To evaluate the performance of the proposed Jaya algorithm, the results obtained by the Jaya algorithm are compared with those obtained by other optimization algorithms such as GA, PSO, DE, ABC and TLBO. In general, an algorithm which requires fewer function evaluations to reach the same best solution can be considered better than the others. However, to maintain consistency in the comparison of the competitive algorithms, a common experimental platform is provided by setting the maximum number of function evaluations to 500000 for each benchmark function. Just like the other algorithms, the proposed Jaya algorithm is executed 30 times for each benchmark function with suitable population sizes, and the mean results obtained are compared with those of the other algorithms for the same number of runs.
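The experimental protocol above (independent runs, then best/worst/mean/standard-deviation statistics per function) can be sketched as below. The helper name `run_statistics` is an assumption made for this illustration; the paper uses 500000 function evaluations per run, which is abstracted here into an arbitrary `optimize` callable returning a final objective value.

```python
import numpy as np

def run_statistics(optimize, runs=30):
    """Execute an optimizer `runs` times independently and summarize
    the final objective values with the statistics tabulated per
    benchmark function: best, worst, mean and standard deviation."""
    finals = np.array([optimize() for _ in range(runs)])
    return {"best": float(finals.min()), "worst": float(finals.max()),
            "mean": float(finals.mean()), "std": float(finals.std())}
```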

Results and discussion on unconstrained benchmark functions
The results of the Jaya algorithm for each benchmark function are presented in Table 16 in the form of the best solution, worst solution, mean solution and standard deviation obtained over 30 independent runs. The performance of the Jaya algorithm is compared with that of other well-known optimization algorithms such as GA, PSO, DE, ABC and TLBO in Table 17. The results of GA, PSO, DE and ABC are taken from Karaboga and Akay (2009) and the results of TLBO are taken from Rao and Patel (2013), where the authors experimented on the benchmark functions, each with 500000 function evaluations, with the best settings of algorithm-specific parameters (except for TLBO which, like the Jaya algorithm, has no algorithm-specific parameters). It can be observed from Tables 16 and 17 that the proposed Jaya algorithm has obtained better results in terms of the 'best', 'mean' and 'worst' values of each objective function and the standard deviation. Furthermore, the performance of the Jaya algorithm is either equal or better than that of all the other algorithms on all 30 unconstrained benchmark functions.

Conclusions
All the evolutionary and swarm intelligence based algorithms require proper tuning of algorithm-specific parameters in addition to tuning of the common control parameters, and a change in the tuning of the algorithm-specific parameters influences the effectiveness of the algorithm. The recently proposed TLBO algorithm does not require any algorithm-specific parameters; it requires only the tuning of the common control parameters for its working. In view of the success of the TLBO algorithm in the field of optimization, another algorithm-specific parameter-less algorithm, named the 'Jaya algorithm', is proposed in this paper. However, unlike the two phases (i.e. the teacher phase and the learner phase) of the TLBO algorithm, the proposed algorithm has only one phase and is comparatively simpler to apply.
The proposed algorithm is implemented on 24 well-defined constrained optimization problems with different characteristics given in the CEC 2006 competition. The results obtained by the proposed Jaya algorithm are compared with the results of well-known optimization algorithms such as the HM, ASCHEA, SMES, GA, PSO, DE, ABC, BBO, HTS and TLBO algorithms for the considered constrained benchmark problems. The results show the satisfactory performance of the Jaya algorithm for constrained optimization problems, and the statistical tests also support the performance supremacy of the proposed method.
The proposed algorithm is also implemented on 30 well-defined unconstrained optimization problems documented in the optimization literature. These unconstrained optimization problems have different characteristics, and the results obtained by the proposed Jaya algorithm are compared with the results of well-known optimization algorithms such as the GA, PSO, DE, ABC and TLBO algorithms. The results show the satisfactory performance of the Jaya algorithm for the considered unconstrained optimization problems. It may be concluded that the proposed Jaya algorithm can be used for solving constrained as well as unconstrained optimization problems.
It is emphasized here that the proposed Jaya algorithm is not claimed to be the 'best' algorithm among all the optimization algorithms available in the literature; in fact, there may not be any such 'best' algorithm for all types and varieties of problems. However, the Jaya algorithm is newly proposed and is believed to have strong potential for solving constrained and unconstrained optimization problems. If the algorithm is found to have certain limitations, then the efforts of researchers should be to find ways to overcome those limitations and to further strengthen the algorithm; the efforts should not take the form of destructive criticism. What can be said with more confidence at present about the Jaya algorithm is that it is simple to apply, it has no algorithm-specific parameters and it provides the optimum results in a comparatively small number of function evaluations.

[Fig. 1. Flowchart of the Jaya algorithm: initialize the population size, number of design variables and termination criterion; identify the best and worst solutions in the population; modify the solutions based on the best and worst solutions; check whether the termination criterion is satisfied.]

Table 1
Initial population

Table 2 shows the new values of x1 and x2 and the corresponding values of the objective function.

Table 2
New values of the variables and the objective function during first iteration

Table 3
Updated values of the variables and the objective function based on fitness comparison at the end of the first iteration

Table 4 shows the new values of x1 and x2 and the corresponding values of the objective function during the second iteration.

Table 4
New values of the variables and the objective function during second iteration

Table 5
Updated values of the variables and the objective function based on fitness comparison at the end of second iteration

Table 6
Comparative results of G01-G13 benchmark functions obtained by different algorithms. Results in boldface indicate the better performing algorithm. (-) indicates that the results are not available. NF means that no feasible solutions were found. a: used a decoder-based penalty method for constraint handling; b: used the static penalty method for constraint handling; c: used Deb's method for constraint handling.

Table 7
Comparative results of G14-G24 benchmark functions obtained by different algorithms for 240000 function evaluations averaged over 100 runs

Table 8
Success rate of various algorithms for G01-G24 functions over 100 runs

Table 10
Friedman rank test for the 'Best' and 'Mean' solutions obtained for G01-G13 functions

Table 13
Holm-Sidak test for the 'Best' and the 'Mean' solutions obtained for G01-G13 functions

Table 15
Unconstrained benchmark functions considered

Table 17
Comparative results of Jaya algorithm with other algorithms over 30 independent runs