A multi-group firefly algorithm for numerical optimization

To address the premature convergence of the firefly algorithm (FA), this paper analyzes the evolution mechanism of the algorithm and proposes an improved firefly algorithm based on a modified evolution model and a multi-group learning mechanism (IMGFA). The firefly colony is divided into several subgroups with different model parameters. Within each subgroup, the optimal firefly leads the other fireflies through the early global evolution and establishes mutual information exchange among the fireflies. Each firefly then performs local search by following the brighter fireflies in its neighborhood. At the same time, a learning mechanism that exchanges information among the best fireflies of the various subgroups helps the population reach the global optimum more effectively. Experimental results verify the effectiveness of the proposed algorithm.


Introduction
Optimization problems in practical engineering are typically large-scale and nonlinear. Traditional optimization algorithms, such as hill climbing and chaotic search, often struggle with this kind of problem. Therefore, swarm intelligence algorithms for global optimization have been proposed in recent years. Notable examples include particle swarm optimization (PSO) [1], the artificial bee colony algorithm (ABC) [2], and ant colony optimization (ACO) [3].
Drawing on the way fireflies use light to pass information, Yang [4] proposed a new swarm intelligence algorithm in 2008, the firefly algorithm (FA). The main idea is as follows. Multiple fireflies are randomly distributed across the search space, and each firefly has a light intensity corresponding to the fitness value of the optimization problem. Each individual then flies toward the firefly with the strongest light intensity within its visual range. After many iterations, all individuals gather around the best firefly, which represents the final solution. Because the firefly algorithm has a simple structure and is easy to implement, it has been widely used in image processing [5], automatic control [6], economic emission dispatch [7], and other fields. However, the standard firefly algorithm suffers from premature convergence and evolutionary stagnation.
To solve these problems, Gandomi [8] proposed a method combining FA with chaos optimization to improve the fireflies' global optimization ability. Fu [9] put forward a multi-group firefly algorithm in which the firefly population is divided into multiple sub-populations to enhance diversity. Adil [10] developed an effective FA with partial random restarts and an adaptive move procedure. Iztok Fister [11] used quaternions to represent individuals in the firefly algorithm so as to enhance its performance and avoid stagnation. Although these methods improve the optimization performance of the firefly algorithm to some extent, the "premature" phenomenon still appears when solving high-dimensional or complex problems. In view of the above, this paper analyzes the causes of premature convergence by studying the evolution mechanism of the firefly algorithm, and puts forward an improved multi-group firefly algorithm based on an improved evolution mechanism (IMGFA). Simulation results show that the modified firefly algorithm avoids getting trapped in local optima and achieves better optimization performance than the standard firefly algorithm.
Section 2 presents and analyzes the standard firefly algorithm. Section 3 puts forward a multi-group firefly algorithm based on an improved evolution model. The experimental results and discussion on benchmark functions are presented in Section 4. Section 5 concludes the work and suggests some directions for future studies.

Algorithm Description
Two key factors arise in the standard firefly algorithm: the light intensity and the attractiveness. Each firefly's fitness determines its light intensity: a better fitness means a stronger light intensity and a more intense attractiveness in its area.

Assume the total number of fireflies is m, randomly initialized in the search space. The distance between fireflies i and j is

r_ij = || x_i − x_j ||,    (2.1)

the light intensity seen at distance r_ij is

I(r_ij) = I_0 · exp(−γ · r_ij²),    (2.2)

and the attractiveness decays with distance as

β(r_ij) = β_0 · exp(−γ · r_ij²),    (2.3)

where I_0 and β_0 are the light intensity and attractiveness at r = 0, and γ is the light absorption coefficient. Firefly i moves toward a brighter firefly j according to

x_i(t+1) = x_i(t) + β_0 · exp(−γ · r_ij²) · (x_j(t) − x_i(t)) + α · randn,    (2.4)

where α is a randomization parameter generated from the interval [0, 1], and randn is a random number drawn from a Gaussian distribution.

Algorithm Optimization Mechanism
As can be seen from equation 2.4, the optimization process of the firefly population is realized through each firefly learning from the more attractive fireflies within its visual range. The random term α · randn effectively expands the search range of the population and helps prevent premature convergence to a local optimum: when a firefly cannot find a better firefly in its neighborhood, it can still move to a random location. Within a small area, fireflies exchange information sufficiently, so they influence each other, evolve quickly, and finally reach the goal. But in a large search area, or when solving a high-dimensional optimization problem, some fireflies may find it difficult to locate and be attracted by a brighter one because of their scattered distribution. As a result, they cannot obtain evolutionary information from other individuals and become "failures" vibrating randomly at their original positions, which greatly weakens the group's optimization ability and causes premature convergence and evolutionary stagnation. If random initialization fails to produce a valid "evolution team" in the vicinity of the optimal target, the firefly population will struggle to reach the optimal target in the following iterations. Therefore, only with close connections among the fireflies can the firefly algorithm achieve the global optimization target quickly and efficiently.
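The collapse of attraction with distance described above is easy to quantify: with β(r) = β_0 · exp(−γ · r²) and β_0 = γ = 1 (illustrative values), a firefly only a few units away already receives an almost-zero pull:

```python
import math

beta0, gamma = 1.0, 1.0
for r in (0.5, 1.0, 3.0, 10.0):
    beta = beta0 * math.exp(-gamma * r ** 2)   # attractiveness, eq. 2.3
    print(f"r = {r:5.1f}  ->  attractiveness = {beta:.3e}")
```

At r = 3 the attractiveness is already on the order of 1e-4, and at r = 10 it underflows toward zero, which is exactly the isolation effect that produces "failure" fireflies.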

A Multi-Group Firefly Algorithm Based on an Improved Evolution Model (IMGFA)
Each firefly seeks a better optimization value by following the brighter fireflies nearby, so multiple attractors guide the population's evolution, which to a certain extent improves the robustness of the algorithm. From equation 2.4 we can see that the attractiveness β decays exponentially as r_ij increases, which helps the fireflies evolve as long as they interact fully. But once r_ij grows beyond a certain range, the interaction is interrupted. That means most of the fireflies become isolated, and evolutionary stagnation occurs.
This paper therefore adjusts the evolution mechanism of FA and puts forward an improved firefly algorithm that appropriately enhances the interactive communication among the fireflies.

Improved Evolutionary Computation Model
In the firefly algorithm, a firefly with higher light intensity attracts its neighboring fireflies, but loses its appeal when its distance to the others becomes large. In order to prevent the population's internal information transmission from being interrupted, and to keep the fireflies active, IMGFA emphasizes the firefly with the best fitness value in the population. The improved evolutionary computation model is

x_i(t+1) = x_i(t) + k · rand · (x_best(t) − x_i(t)),  if β(r_ij) < S_t,    (3.1)

where x_best(t) is the location of the brightest firefly in the population at time t. Equation 3.1 indicates that the brightest firefly attracts all the others when information interruption arises. rand is a randomization parameter generated from the interval [0, 1]. k is a global attractiveness coefficient that decreases gradually over the iterations, which efficiently curbs the growth of disabled fireflies. S_t is the global attractiveness threshold: when the attractiveness of a firefly is less than S_t, the firefly is treated as a failure that has lost contact with the others. The brightest firefly then undertakes the mission of dragging the stray firefly back into effective information interaction with the others.
Compared with equation 2.4, the improved evolutionary computation model highlights the global attractor created by the brightest firefly in the population. From equation 3.1, each firefly can receive evolution information even when it is out of touch with its neighbors. The global attractor also promotes the diversity of the population's evolution and locates the optimal solution quickly at the beginning of the evolution.
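A minimal sketch of the rescue move of equation 3.1 follows; the linear decay of k and the initial value k0 are assumptions, since the paper only states that k decreases over the iterations:

```python
import numpy as np

def max_attractiveness(x_i, X_brighter, beta0=1.0, gamma=1.0):
    """Strongest pull firefly i receives from the brighter fireflies (eq. 2.3)."""
    if len(X_brighter) == 0:
        return 0.0
    r = np.linalg.norm(X_brighter - x_i, axis=1)
    return float(np.max(beta0 * np.exp(-gamma * r ** 2)))

def rescue_step(x_i, x_best, t, G, k0=1.0, rng=None):
    """Global-attractor move of equation 3.1, applied when the firefly's
    best available attractiveness falls below the threshold S_t."""
    rng = rng or np.random.default_rng(0)
    k = k0 * (1.0 - t / G)       # assumed linear decay of the coefficient k
    return x_i + k * rng.random(x_i.shape) * (x_best - x_i)
```

In the main loop one would test `max_attractiveness(...) < S_t` and, when it holds, replace the normal update of equation 2.4 with `rescue_step`.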

Simulated Annealing Rule
The simulated annealing rule simulates the stochastic process of heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. Transitions between states are governed by a rule that favors states with lower cost, or "energy". The temperature starts high and is gradually reduced over time according to a specified cooling schedule, in the belief that different features of the solution will "crystallize out" at different temperatures. At low-temperature equilibrium, low-energy states occur with much higher probability than high-energy ones. The simulated annealing rule is used in this paper to avoid premature convergence. Assume the current state of the fireflies is X = (x_1, x_2, ..., x_m).
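The paper does not spell out the acceptance rule it uses, so the sketch below shows the standard Metropolis criterion with a geometric cooling schedule, which is the usual way simulated annealing is wired into a swarm algorithm:

```python
import math
import random

def accept(delta_f, temperature, rng=random.Random(0)):
    """Metropolis criterion: always accept an improvement (delta_f <= 0);
    accept a worse move with probability exp(-delta_f / temperature)."""
    if delta_f <= 0:
        return True
    return rng.random() < math.exp(-delta_f / temperature)

def cooling(t, T0=1.0, c=0.95):
    """Geometric cooling schedule T(t) = T0 * c**t (assumed form)."""
    return T0 * c ** t
```

As the temperature falls, worse moves are accepted less and less often, so early iterations explore broadly while late iterations settle into low-energy states.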

Multi-Group Optimization Mechanism
This paper divides the firefly population into n subgroups randomly distributed in the feasible space. The parameters α, γ, and β_0 are set differently for each subgroup's model, so that each subgroup evolves in a diversified style of "individual cognition" and "social cognition" and achieves better global optimization results.
In each subgroup, a bulletin board records the best firefly's fitness and position at every iteration and recruits the best fireflies into an elite team. A subgroup is declared premature when its best fitness has not improved for five successive iterations. The best firefly in the premature subgroup can then seek help from the elite team and jump out of the current locally optimal area. The elite team constructs an information-exchange channel between the subgroups for mutual learning and provides more effective guidance to subgroups suffering premature convergence.
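The bookkeeping above can be sketched as a small per-subgroup record; the five-iteration stagnation window follows the text, while the class layout itself is an assumption:

```python
class BulletinBoard:
    """Records the best firefly of one subgroup and detects prematurity."""

    def __init__(self, window=5):
        self.window = window          # iterations of stagnation tolerated
        self.best_fit = float("inf")
        self.best_pos = None
        self.stall = 0                # iterations since the best fitness improved

    def update(self, fit, pos):
        if fit < self.best_fit:
            self.best_fit, self.best_pos, self.stall = fit, pos, 0
        else:
            self.stall += 1

    def premature(self):
        return self.stall >= self.window
```

One board per subgroup suffices: each iteration calls `update` with the subgroup's current best, and `premature()` triggers the help-seeking step.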
If a premature subgroup cannot get help from the elite team for lack of a better guide, it heals itself by Gaussian mutation: sort the fireflies by fitness value, exchange the positions of the best firefly and the worst firefly, and update the relocated worst firefly by Gaussian mutation:

x' = x + N(μ, σ²),    (3.2)

where N(μ, σ²) is a Gaussian-distributed random vector with expectation μ and variance σ². This increases population diversity within the premature subgroup, which can take the subgroup out of the local area and achieve the global optimization goal effectively.
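A sketch of this self-healing step, reading equation 3.2 as an additive Gaussian perturbation; the additive form and the choice to mutate the relocated worst firefly are assumptions based on the description above:

```python
import numpy as np

def gauss_heal(X, fitness, mu=0.0, sigma=1.0, rng=None):
    """Self-healing of a premature subgroup (sketch of equation 3.2):
    swap the best and worst fireflies' positions, then add a Gaussian
    random vector N(mu, sigma**2) to the relocated worst firefly."""
    rng = rng or np.random.default_rng(0)
    X = X.copy()
    order = np.argsort([fitness(x) for x in X])      # ascending: best first
    b, w = order[0], order[-1]
    X[[b, w]] = X[[w, b]]                            # exchange positions
    X[w] = X[w] + rng.normal(mu, sigma, X.shape[1])  # Gaussian mutation
    return X
```

The mutated firefly restarts its search from the neighborhood of the subgroup's best-known position, injecting fresh diversity.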

Adaptive Distance Exponent Weight
The distance exponent weight λ is an extremely important parameter in the firefly algorithm. When λ is small, a firefly has a wider field of vision and can search more neighboring fireflies for extensive information exchange. When λ is large, the firefly's field of vision narrows for local search. As shown in equation 2.3, λ is fixed at 2, which is not enough to balance the requirements of global search and local convergence. Bearing these observations in mind, we propose the adaptive distance exponent weight

λ(i) = λ_min + (λ_max − λ_min) · i / G,    (3.3)

where λ_max is the upper bound of λ, λ_min is the lower bound, G is the total number of iterations, and i is the index of the current iteration. Equation 3.3 indicates that λ increases at each iteration. In the early period of the run, the smaller λ helps the algorithm jump out of local minima and accomplish the task of global optimization, while in the later period the larger λ helps it rapidly reach local convergence. The firefly algorithm gains better adaptability by adjusting λ as the iterations progress.
Step 4: replace the firefly with fitness f_ibest by the firefly with fitness f_kbest; otherwise, in subgroup i, exchange the positions of the best firefly and the worst firefly, and update the worst one by Gaussian mutation according to equation 3.2.
Step 5: adjust λ according to equation 3.3, and apply the domain constraint to all fireflies.
Step 6: determine whether the convergence criterion has been met; if satisfied, go to Step 7, otherwise return to Step 2.
Step 7: output the results and stop. The time complexity of the algorithm is O(m²).
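The schedule of equation 3.3 can be sketched as a linear interpolation between the bounds; the linear form is an assumption, since the paper only states that λ grows with the iteration count, and the default bounds are illustrative:

```python
def adaptive_lambda(i, G, lam_min=1.0, lam_max=3.0):
    """Distance exponent for iteration i of G (equation 3.3, assumed linear).
    lam_min and lam_max are illustrative bounds, not values from the paper."""
    return lam_min + (lam_max - lam_min) * i / G
```

Early iterations use a small exponent (wide vision, global search), while later iterations approach lam_max (narrow vision, local convergence).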

Experimental Result and Discussion
In this section the IMGFA algorithm is benchmarked on six classical benchmark functions that have been used extensively. They include three fixed-dimension multimodal functions and three varied-dimension functions, shown in Table 1 and Table 2, where f4 and f5 are unimodal functions and f1, f2, f3, and f6 are multimodal functions. The total number of fireflies is 60 for each test function. There are 3 subgroups in the IMGFA algorithm, with 20 fireflies in each sub-population. Other parameters are shown in Table 3. In order to evaluate the IMGFA algorithm intuitively and comprehensively, this paper also includes ABC and PSO in the tests; their population sizes are likewise set to 60. Each benchmark function is run 20 times independently. All of the varied-dimension benchmark functions are tested in 10, 20 and 30 dimensions. The statistical results (mean and standard deviation) are reported in Tables 4~7. The simulation results in Table 4 show that, compared with FA, ABC, and PSO, IMGFA achieves better accuracy and better robustness with smaller variance. Especially for f1 and f2, IMGFA obtains the minimum value in every test. According to the results of Tables 5~7, IMGFA also provides very competitive results on the varied-dimension benchmark functions. IMGFA captures the global best value for f4 and f6 in 10, 20 and 30 dimensions. For f5, IMGFA does not reach the final goal, but it still shows better solution accuracy and stability than all the others.

Conclusions
Although the standard firefly algorithm has the advantages of a simple structure, few adjustable parameters, and good optimization ability, it is prone to premature stagnation when solving high-dimensional optimization problems or searching large solution spaces. This paper studies the evolution mechanism of the firefly algorithm and proposes an improved firefly algorithm, IMGFA. IMGFA emphasizes the traction force of the best firefly on the "failure" fireflies, sets up a multi-subgroup network to enhance the diversity of the firefly population, and tackles premature convergence with the simulated annealing rule and Gaussian mutation. Simulation results on benchmark functions show that IMGFA achieves better accuracy and stability than the standard FA, ABC, and PSO.