An improved multi-objective gravitational search algorithm

Multi-objective search algorithms, such as Multiple Objective Particle Swarm Optimization (MOPSO) and the Non-dominated Sorting Genetic Algorithm II (NSGA-II), are common optimization tools for complex multi-objective problems. The Gravitational Search Algorithm (GSA) is a recent heuristic evolutionary algorithm that searches for optimal solutions based on Newtonian gravity and the laws of motion. Because some agents have larger mass, agents with smaller mass are easily attracted to them and fall into local optima. To improve the search ability of the algorithm, this paper proposes an Improved Multi-Objective Gravitational Search Algorithm (IMOGSA). The proposed method adopts the fast non-dominated sorting strategy and crowding distance of NSGA-II together with the Sine Cosine Algorithm (SCA). The NSGA-II strategy reduces the complexity of the algorithm, while SCA improves the convergence and distribution of IMOGSA by adapting the weight of acceleration. Finally, the proposed method is compared with other well-known heuristic search methods on several benchmark functions.


Introduction
Multi-objective optimization refers to dealing with two or more optimization objectives simultaneously, where the objective functions conflict with one another. It is widely used in scientific research and practical engineering applications [1], such as workshop scheduling [2] and resource allocation and environmental problems [3]. The purpose of single-objective optimization is to find the global optimal solution, but multi-objective optimization is different: a solution that is better for one objective may be worse for the others. Therefore, the computation yields a set of non-dominated solutions. At present, various heuristic optimization algorithms have been applied to multi-objective optimization. For example, Srinivas et al. [4] proposed the Non-dominated Sorting Genetic Algorithm II (NSGA-II) and Coello et al. [5] proposed Multiple Objective Particle Swarm Optimization (MOPSO); both are classical multi-objective optimization algorithms.
The Gravitational Search Algorithm (GSA) is a recent heuristic search algorithm proposed by Rashedi et al. [6], inspired by the gravitational attraction between objects in the universe. In GSA, the position of each agent represents a feasible solution, and the mass of the agent reflects how good that position is: a larger mass corresponds to a better position and a smaller mass to a worse one. The motion of an agent in the solution space is determined by the gravitation exerted by the other agents. Under the same gravitational force, an agent with larger mass receives a smaller acceleration and velocity, and therefore travels a shorter distance per unit time; conversely, an agent with smaller mass receives a larger velocity and acceleration and travels a longer distance. Because GSA has few parameters, good global search ability, and fast convergence, it has been applied to many single-objective optimization problems with remarkable results [7][8][9][10][11]. Hassanzadeh et al. [12] combined GSA with MOPSO and proposed MOGSA, and Nobahari et al. [13] combined GSA with NSGA-II and proposed NSGSA; these algorithms are applied to solve multi-objective optimization problems. However, algorithms such as MOGSA easily fall into local optima because of their relatively poor convergence [14]. Mirjalili [15] proposed the Sine Cosine Algorithm (SCA), which updates the location of the current solution with multiple random and adaptive variables; the algorithm can search different regions of the space and thereby avoid local optima effectively. This paper proposes IMOGSA, which adopts the fast non-dominated sorting strategy and crowding distance of NSGA-II together with SCA.
The proposed method uses the NSGA-II strategy to reduce the complexity of the algorithm and uses SCA to improve convergence and distribution by adapting the weight of acceleration. Finally, benchmark functions are used to compare IMOGSA with other multi-objective optimization algorithms and illustrate its effectiveness.

Gravitational search algorithm
GSA is derived from the law of universal gravitation: any two objects in the universe attract each other through gravitation. Each individual can be regarded as an agent in the solution space, and each agent has a mass, which is the index used to evaluate how good the agent is. The position of an agent with large mass is regarded as a better solution, and such agents attract the agents with smaller mass, which then move toward them. Moreover, every agent shares its position information with the others; after receiving this information, agents identify better positions and move in those directions. In this way, agents share information during the evolution, and the algorithm searches the space for better solutions.
Suppose that there are N agents in the space. Eq. (1) gives the position of the ith agent:

$X_i = (x_i^1, \dots, x_i^k, \dots, x_i^n), \quad i = 1, 2, \dots, N$  (1)

where n is the dimension of the search space and x is the decision variable; $x_i^k$ is the position of the ith agent in the kth dimension. The gravitational force acting on agent i from agent j in the kth dimension can be expressed by Eq. (2):

$F_{ij}^k(t) = G(t) \frac{M_{pi}(t) M_{aj}(t)}{R_{ij}(t) + \varepsilon} \left( x_j^k(t) - x_i^k(t) \right)$  (2)
$M_{aj}(t)$ is the mass of agent j and $M_{pi}(t)$ is the mass of agent i. $\varepsilon$ is a small constant used to avoid division by zero. $R_{ij}(t)$ is the Euclidean distance between the two agents in Eq. (3), and $G(t)$ is the gravitational constant in Eq. (4), which decays from an initial value $G_0$ over the maximum number of iterations T:

$R_{ij}(t) = \left\| X_i(t), X_j(t) \right\|_2$  (3)

$G(t) = G_0 e^{-\alpha t / T}$  (4)
The mass of an agent is related to its fitness value: an agent with larger mass tends to be closer to the optimal solution, and it attracts the agents with smaller mass. Eqs. (5) and (6) compute the mass of an agent:

$m_i(t) = \frac{fit_i(t) - worst(t)}{best(t) - worst(t)}$  (5)

$M_i(t) = \frac{m_i(t)}{\sum_{j=1}^{N} m_j(t)}$  (6)

where $fit_i(t)$ is the fitness of agent i, $worst(t)$ is the worst fitness value in the population, and $best(t)$ is the best fitness value. For example, for a minimization problem they are computed by Eqs. (7) and (8):

$best(t) = \min_{j \in \{1, \dots, N\}} fit_j(t)$  (7)

$worst(t) = \max_{j \in \{1, \dots, N\}} fit_j(t)$  (8)

Agents with smaller mass are subject to the acceleration induced by agents with larger mass. Eq. (9) computes the acceleration of an agent:

$a_i^k(t) = \frac{F_i^k(t)}{M_i(t)}$  (9)

where $F_i^k(t)$ is the total force acting on agent i in the kth dimension, i.e. the randomly weighted sum of the forces $F_{ij}^k(t)$ exerted by the other agents.
$a_i^k(t)$ is the acceleration of agent i in the kth dimension; the acceleration of small-mass agents is dominated by the larger agents. Eqs. (10) and (11) update the velocity and position of each agent during the evolution:

$v_i^k(t+1) = rand_i \times v_i^k(t) + a_i^k(t)$  (10)

$x_i^k(t+1) = x_i^k(t) + v_i^k(t+1)$  (11)

where $rand_i$ is a uniform random number in [0, 1].
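As a reading aid, the update rules of Eqs. (1)-(11) can be sketched in Python. This is a minimal single-objective GSA under stated assumptions, not the paper's implementation: it omits the Kbest elite set of the original GSA, and the function name and parameter defaults are illustrative.

```python
import numpy as np

def gsa_minimize(f, bounds, n_agents=30, n_iter=200, G0=100.0, alpha=20.0, seed=0):
    """Minimal GSA sketch for minimization, following Eqs. (1)-(11):
    masses from fitness, pairwise gravitational forces, then
    velocity/position updates."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T            # bounds: list of (lo, hi) per dimension
    X = rng.uniform(lo, hi, size=(n_agents, lo.size))  # Eq. (1): agent positions
    V = np.zeros_like(X)
    eps = 1e-12                                     # the small constant of Eq. (2)
    for t in range(n_iter):
        fit = np.array([f(x) for x in X])
        best, worst = fit.min(), fit.max()          # Eqs. (7)-(8), minimization
        m = (fit - worst) / (best - worst + eps)    # Eq. (5): raw masses in [0, 1]
        M = m / (m.sum() + eps)                     # Eq. (6): normalized masses
        G = G0 * np.exp(-alpha * t / n_iter)        # Eq. (4): decaying constant
        A = np.zeros_like(X)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                R = np.linalg.norm(X[i] - X[j])     # Eq. (3): Euclidean distance
                # Eqs. (2) and (9): the passive mass M_i cancels in a = F / M_i
                A[i] += rng.random() * G * M[j] * (X[j] - X[i]) / (R + eps)
        V = rng.random(V.shape) * V + A             # Eq. (10): velocity update
        X = np.clip(X + V, lo, hi)                  # Eq. (11): position update
    best_idx = int(np.argmin([f(x) for x in X]))
    return X[best_idx], f(X[best_idx])
```

On a simple unimodal function such as the sphere, this sketch converges toward the origin, which matches the behaviour described above: heavy (good) agents move little, light (bad) agents are pulled toward them.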
Improved multi-objective gravitational search algorithm

Multi-objective optimization
Multi-objective optimization problems are usually described, in their general form, as Eqs. (12) and (13):

$\min F(x) = \left( f_1(x), f_2(x), \dots, f_m(x) \right)$  (12)

$\text{s.t.} \quad x \in \Omega$  (13)

where m is the number of objectives and $\Omega$ is the feasible decision space. In a multi-objective optimization problem there is no unique global optimal solution. During the optimization, a group of non-dominated solutions is obtained according to the dominance relation between agents; these solutions are also called the Pareto optimal set.
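The dominance relation that defines the Pareto optimal set can be made concrete with a short sketch (helper names are illustrative; minimization is assumed):

```python
import numpy as np

def dominates(u, v):
    """u dominates v (minimization): u is no worse in every objective
    and strictly better in at least one."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    return bool(np.all(u <= v) and np.any(u < v))

def pareto_front(F):
    """Indices of the non-dominated rows of the objective matrix F
    (one row per solution, one column per objective)."""
    F = np.asarray(F, float)
    return [i for i in range(len(F))
            if not any(dominates(F[j], F[i]) for j in range(len(F)) if j != i)]
```

For instance, among the objective vectors (1, 2), (2, 1), (2, 2), and (3, 3), the first two are mutually non-dominated and form the front, while the last two are dominated.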

Fast non-dominated sorting and crowding
Fast non-dominated sorting is an improvement of the NSGA sorting strategy [4]. The strategy combines the individual's Pareto rank with its crowding distance. The crowding distance measures how densely individuals surround a given point in the population; Figure 1 illustrates it. From Figure 1, the crowding distance of individuals a, b, and c is smaller than that of individuals I to VI. The Pareto non-dominated sorting value of individuals I to VI is 1, that of individuals a to c is 2, and so on. To preserve the diversity of the population distribution, the crowding distance within the neighbourhood of an individual is taken into account: in Eq. (15), the Pareto ranking value of an individual x is defined as the sum of the non-dominated sorting values of the individuals dominated by x and the current non-dominated sorting value of x itself. The acceleration of an agent with respect to one objective can affect its motion toward another objective; this acceleration is given by Eq. (18).
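The fast non-dominated sorting and crowding-distance computations borrowed from NSGA-II can be sketched as follows. This is a straightforward O(mN^2) version with illustrative names, not the paper's code; minimization is assumed.

```python
import numpy as np

def fast_nondominated_sort(F):
    """NSGA-II fast non-dominated sorting: returns a list of fronts,
    each front being a list of row indices of F (front 0 = rank 1)."""
    F = np.asarray(F, float)
    n = len(F)
    S = [[] for _ in range(n)]     # S[i]: solutions dominated by i
    counts = np.zeros(n, dtype=int)  # counts[i]: how many dominate i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                S[i].append(j)
            elif np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                counts[i] += 1
    fronts = [[i for i in range(n) if counts[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in S[i]:           # peel off the current front
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]               # drop the trailing empty front

def crowding_distance(F):
    """Crowding distance of each solution within one front
    (larger = less crowded; boundary points get infinity)."""
    F = np.asarray(F, float)
    n, m = F.shape
    d = np.zeros(n)
    for k in range(m):
        order = np.argsort(F[:, k])
        d[order[0]] = d[order[-1]] = np.inf   # keep the extremes
        span = F[order[-1], k] - F[order[0], k]
        if span > 0:
            for r in range(1, n - 1):
                d[order[r]] += (F[order[r + 1], k] - F[order[r - 1], k]) / span
    return d
```

Individuals are then compared first by front rank and, within a front, by crowding distance, which is exactly the diversity-preserving selection described above.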
The total acceleration of an individual, which is affected by both objectives, is given by Eq. (19). According to [15], the weight in Eq. (19) is a random number in (0, 1). However, randomly selected weights do not work well in the search. By adding SCA, IMOGSA improves the weight of acceleration, which enhances the motion of individuals and improves the convergence ability of the algorithm. Eqs. (20) and (21) give the improved weight.
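Since Eqs. (20) and (21) are not reproduced above, the following sketch only illustrates the general SCA weighting idea from Mirjalili's algorithm: an amplitude that decreases linearly over the iterations combined with a random sine or cosine phase. The function name, the default amplitude a = 2, and the clipping to [0, 1] are assumptions for illustration, not the paper's exact formulas.

```python
import math
import random

def sca_weight(t, T, a=2.0):
    """Illustrative SCA-style adaptive weight: the amplitude r1 decreases
    linearly with iteration t (out of T), while a random sine/cosine phase
    keeps the weight oscillating, as in the standard Sine Cosine Algorithm."""
    r1 = a - t * (a / T)                  # linearly decreasing amplitude
    r2 = 2.0 * math.pi * random.random()  # random phase in [0, 2*pi)
    branch = math.sin(r2) if random.random() < 0.5 else math.cos(r2)
    return min(abs(r1 * branch), 1.0)     # clip so two objectives can be blended

# A blended two-objective acceleration could then look like:
#   w = sca_weight(t, T)
#   a_total = w * a_objective1 + (1.0 - w) * a_objective2
```

Early in the run the weight oscillates widely, favouring exploration; by the last iteration r1 reaches zero, so the weight settles and the search becomes exploitative.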

Improved algorithm process
The following are the algorithm steps of IMOGSA.

Numerical computations
To evaluate a multi-objective algorithm, this paper uses Generational Distance (GD) and Spacing (SP), the two criteria introduced in [16]: GD measures how close the obtained solutions are to the true Pareto front, and SP measures how uniformly they are distributed along it. To evaluate IMOGSA, the benchmark functions ZDT1, ZDT2, and ZDT4 [17] are used to compare it with the classical multi-objective algorithms MOPSO and NSGA-II; the function definitions follow [17]. Figures 3 and 4 show the Pareto optimal fronts of IMOGSA compared with NSGA-II and MOPSO. As easily observed from Tables 1 and 2, in some respects the results of IMOGSA are better than those of NSGA-II and MOPSO: the GD of IMOGSA outperforms NSGA-II, and its SP improves on both NSGA-II and MOPSO. Therefore, the distribution of IMOGSA is better than that of NSGA-II and MOPSO. However, when IMOGSA is compared with MOPSO in terms of GD, the efficiency of the proposed algorithm still needs to be improved.
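The two criteria can be stated concretely. The sketch below uses the widely cited definitions of GD (Van Veldhuizen) and SP (Schott), which [16] is assumed here to follow; function names are illustrative.

```python
import numpy as np

def generational_distance(A, P):
    """GD: for each obtained solution (row of A), take the Euclidean
    distance to the nearest point of the reference front P, then
    sqrt(sum of squares) / |A|. Lower is better (0 = on the front)."""
    A, P = np.asarray(A, float), np.asarray(P, float)
    d = np.array([np.min(np.linalg.norm(P - a, axis=1)) for a in A])
    return float(np.sqrt(np.sum(d ** 2)) / len(A))

def spacing(A):
    """SP (Schott): standard deviation of each solution's L1 distance to
    its nearest neighbour within the obtained front A. Lower means a
    more uniform distribution (0 = perfectly even spacing)."""
    A = np.asarray(A, float)
    n = len(A)
    d = np.array([min(np.sum(np.abs(A[i] - A[j])) for j in range(n) if j != i)
                  for i in range(n)])
    return float(np.sqrt(np.sum((d.mean() - d) ** 2) / (n - 1)))
```

As a sanity check, a front whose points all lie on the reference front has GD = 0, and evenly spaced points have SP = 0.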

Conclusions
This paper introduces a new improved multi-objective gravitational search algorithm that uses the fast non-dominated sorting strategy and SCA. It uses the NSGA-II strategy to reduce the complexity of the algorithm and SCA to improve convergence and distribution uniformity by adapting the weight of acceleration. To evaluate the proposed method, we examined it on standard benchmark functions and compared it with classical multi-objective optimization algorithms. The results show that the proposed method has some advantages over NSGA-II and MOPSO. However, when IMOGSA is compared with MOPSO in terms of GD, it still needs to be improved.
In future work, we will use more benchmark functions to analyze the proposed algorithm, and we will apply chaotic maps to the initial population to improve the convergence and distribution of IMOGSA.