Enhanced Differential Evolution Algorithm with Local Search Based on Hadamard Matrix

Differential evolution (DE) is a robust global optimization algorithm that has been applied to many real-world problems since it was proposed. However, its binomial crossover does not search the local space effectively, so DE's local search performance is relatively poor. When DE is applied to complex optimization problems, this inefficiency in local search seriously limits its overall performance. To overcome this disadvantage, this paper introduces a new local search scheme based on the Hadamard matrix (HLS). HLS improves the probability of finding the optimal solution by producing multiple offspring in the local space built by the target individual and its descendant. HLS has been implemented in four classical DE algorithms and in jDE, a variant of DE. Experiments are carried out on a set of widely used benchmark functions. On 20 benchmark problems, the four DE schemes with HLS outperform the corresponding DE schemes on 80%, 75%, 65%, and 65% of the problems, respectively. Also, jDE with HLS performs better than jDE on 50% of the test problems. The experimental results and statistical analysis reveal that HLS can effectively improve the overall performance of DE and jDE.


Introduction
Differential evolution (DE), proposed by Storn and Price in 1995 for solving the Chebyshev polynomial fitting problem [1], is a well-known numerical optimization algorithm. Due to its simple structure, small number of parameters, easy implementation, and outstanding optimization performance, DE has drawn great attention from researchers and engineers since it was proposed. Over the past two decades, DE has been successfully applied in a variety of fields, such as computer vision [2], dynamic economic dispatch [3], engineering design [4], project scheduling [5], artificial neural networks [6], and complex problems inherent to magnetorheological fluids of interest to the automotive industry, in the framework of extended irreversible thermodynamics [7, 8]. Unlike other population-based evolutionary algorithms, the mutation operator in DE utilizes differential information between individuals in the current population. This mechanism gives DE an obvious edge over other evolutionary algorithms. The binomial crossover, however, produces only one offspring in the space constructed by the target individual and its descendant. Therefore, the trial individual is only one of many potential solutions, and the others are ignored. Hence, DE's search of this subspace is insufficient, which clearly affects its overall performance.
To fill this gap, we introduce a new local search scheme based on the Hadamard matrix (HLS) to improve the overall performance of DE. The remainder of this paper is organized as follows. Section 2 introduces the basic elements of the DE algorithm and reviews the related work. Section 3 presents the details of the Hadamard local search. The experimental results are reported in Section 4, and Section 5 concludes the paper.

Background
2.1. Differential Evolution. The DE algorithm consists of the following four steps.

2.1.1. Initialization.
Initialization is the first step of the DE algorithm. It randomly generates a population of NP individuals in a D-dimensional space. For the ith individual, the jth parameter is initialized by

x_{i,j} = L_j + Rand(0, 1) × (U_j − L_j),

where Rand(0, 1) is a uniformly distributed random number within the range [0, 1], and L_j and U_j are the lower and upper bounds of the jth dimension, i ∈ [1, NP], j ∈ [1, D].
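The initialization formula above can be sketched in a few lines of NumPy (a minimal illustration; the function name and signature are ours, not from the paper):

```python
import numpy as np

def initialize_population(NP, D, lower, upper, rng=None):
    """Randomly generate NP individuals in the D-dimensional box [lower, upper]:
    x_{i,j} = L_j + Rand(0, 1) * (U_j - L_j)."""
    rng = np.random.default_rng() if rng is None else rng
    return lower + rng.random((NP, D)) * (upper - lower)
```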

2.1.2. Mutation Operator.
Following initialization, the mutation operator is applied to each target individual x_i^G, generating a mutant vector v_i^G. In view of the important influence that the mutation operator has on DE's global search ability, many researchers have focused on improving it. Six efficient and widely used operators [9] are listed below.
(i) DE/rand/1: v_i^G = x_{r1}^G + F × (x_{r2}^G − x_{r3}^G)
(ii) DE/best/1: v_i^G = x_{best}^G + F × (x_{r1}^G − x_{r2}^G)
(iii) DE/rand/2: v_i^G = x_{r1}^G + F × (x_{r2}^G − x_{r3}^G) + F × (x_{r4}^G − x_{r5}^G)
(iv) DE/best/2: v_i^G = x_{best}^G + F × (x_{r1}^G − x_{r2}^G) + F × (x_{r3}^G − x_{r4}^G)
(v) DE/rand-to-best/1: v_i^G = x_{r1}^G + F × (x_{best}^G − x_{r1}^G) + F × (x_{r2}^G − x_{r3}^G)
(vi) DE/current-to-best/1: v_i^G = x_i^G + F × (x_{best}^G − x_i^G) + F × (x_{r1}^G − x_{r2}^G)

Here r1, r2, r3, r4, and r5 are mutually distinct indices randomly selected from [1, NP] and different from i, x_{best}^G is the best individual of generation G, and F is the scaling factor.

2.1.3. Crossover Operator.
The crossover operator randomly combines the genes of the target and its mutant to produce a new offspring. The binomial crossover is the most commonly used method. It is expressed as

u_{i,j}^G = v_{i,j}^G, if rand(0, 1) ≤ CR or j = j_rand; otherwise u_{i,j}^G = x_{i,j}^G,

where rand(0, 1) is a uniform random number in [0, 1], CR is the crossover probability, and j_rand is a randomly chosen index in [1, D] that guarantees at least one gene is inherited from the mutant.
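As an illustration, the DE/rand/1 mutation and the binomial crossover can be sketched in NumPy as follows (a minimal sketch with our own function names and signatures, not the authors' implementation):

```python
import numpy as np

def de_rand_1(pop, i, F, rng):
    """DE/rand/1: v_i = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct and != i."""
    r1, r2, r3 = rng.choice([k for k in range(len(pop)) if k != i],
                            size=3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])

def binomial_crossover(target, mutant, CR, rng):
    """Binomial crossover: inherit each gene from the mutant with probability CR;
    index j_rand guarantees at least one mutant gene."""
    D = len(target)
    mask = rng.random(D) < CR
    mask[rng.integers(D)] = True  # j_rand
    return np.where(mask, mutant, target)
```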

2.1.4. Selection Operator.
The DE selection operator is a greedy strategy: of the target individual and its offspring, the one with the better fitness value enters the next generation. For a minimization problem, the selection operator is given by

x_i^{G+1} = u_i^G, if f(u_i^G) ≤ f(x_i^G); otherwise x_i^{G+1} = x_i^G.
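The greedy selection rule above amounts to a one-line comparison (sketch for minimization; the helper name is ours):

```python
import numpy as np

def select(target, trial, f):
    """Greedy DE selection for minimization: the individual with the better
    (lower or equal) objective value survives into the next generation."""
    return trial if f(trial) <= f(target) else target
```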

Related Work
Although DE has been successfully applied in many fields, its performance still needs improvement in many others. Thus, several improved versions of DE have been proposed. These works can be divided into the following four categories [11]. Gong and Cai proposed a ranking-based mutation strategy for DE [12], in which some of the parents are selected proportionally based on their ranking in the current population. Peng et al.
proposed an improved differential evolution named RNDE, which uses a new mutation operator, DE/neighbor/1, to balance the exploration and exploitation abilities of DE [13]. The fittest individual refinement (FIR) scheme was proposed in [21]. In FIR, the search space around the best individual is explored greedily in each generation. Later, two implementations of FIR (DEfirDE and DEfirSPX) were proposed. The experimental results show that both schemes speed up DE on a set of well-known test functions, especially in high dimensions, and that they outperform two other well-known variants of DE. A crossover-based adaptive local search (LS) operator was proposed to enhance the performance of the standard DE algorithm [22]. The new algorithm mainly improves the local search by adaptively adjusting the search length using a hill-climbing heuristic. Trigonometric local search (TLS) and interpolated local search (ILS) were proposed in [23]. Combining these two local search strategies, two new DE variants (DETLS and DEILS) were implemented. The new schemes improved the quality of DE's solutions without compromising the convergence rate. A restart differential evolution algorithm with local search mutation (RDEL) was proposed in [24]. In RDEL, a novel local mutation rule based on the positions of the best and the worst individuals of the population at a particular generation is introduced and combined with the basic mutation rule through a linearly decreasing function. The new local mutation effectively enhances the local search tendency of basic DE and accelerates its convergence. An adaptive local search for dynamically balancing global search (GS) and local search (LS) was proposed in [25]. In this scheme, if LS performs better than GS, its utilization preference is increased; if LS does not perform well, its preference is reduced.
The performance of the hybrid algorithm with the adaptive LS scheme was evaluated on 10 benchmark problems, and the results prove its effectiveness. An enhanced differential evolution with random local search (DERLS) was proposed in [26]. The advantage of the random local search in DERLS is that it makes a small random "jump" to a more promising area of the solution space, thus escaping local optima. It is very simple, fast to compute, and more efficient than classical DE on multimodal functions. Peng et al. proposed a heterozygous differential evolution with Taguchi local search, which effectively enhances the local search performance of DE [27].

The New Local Search Scheme
Inspired by local search methods, this paper uses the Hadamard matrix to construct a local search for DE.

Motivation.
A crossover operator is a recombination operator that generates an offspring around the parents; therefore, a local search strategy can be regarded as a moving operator [22]. In traditional DE, the binomial crossover operator (the most commonly used crossover operator) generates and evaluates only a single trial vector, which is one vertex of the hyper-rectangle defined by the mutant vector and the target vector [17]. That is to say, only one of many possible combinations is obtained, so the search of the space around the parents is inadequate. On the other hand, checking all vertices of the hyper-rectangle defined by the mutant vector and the target vector would require a great deal of computation. In this paper, a compromise is adopted: a local search operator based on the Hadamard matrix is used to search several of the vertices.

Local Search Based on Hadamard Matrix.
A Hadamard matrix is a square matrix whose entries are either +1 or −1 and whose rows are mutually orthogonal. For example, the fourth-order Hadamard matrix H4 is

H4 = [ +1 +1 +1 +1
       +1 −1 +1 −1
       +1 +1 −1 −1
       +1 −1 −1 +1 ]

In geometric terms, each pair of rows of the Hadamard matrix represents two mutually perpendicular vectors [28]. Except for the first row, half of the elements in each row are +1 and the other half are −1. This feature allows us to construct a new local search strategy based on the Hadamard matrix (HLS).
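Hadamard matrices of power-of-two order, such as H4 above, can be obtained with the standard Sylvester construction; a minimal sketch (the function name is ours):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2):
    H_{2k} = [[H_k, H_k], [H_k, -H_k]], starting from H_1 = [1]."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H
```

For any Hadamard matrix H of order n, H H^T = n I, which expresses the mutual orthogonality of its rows.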
Based on the above characteristics of the Hadamard matrix, we propose a new local search operator, taking the fourth-order Hadamard matrix H4 as an example. Since the dimension d of the optimization problem is generally greater than 4, the crossover operator cannot be built directly on H4. To use H4, the d-dimensional space of the optimization problem needs to be divided into several subspaces. For example, if the dimension of the optimization problem is 10, the index interval [1, 10] is randomly divided into four subintervals, each of which corresponds to one column of H4. Figure 1 shows an example of HLS.
Algorithm 1 presents the steps of HLS. With HLS, v1 and x1 produce four offspring, where v1 is the mutant individual of x1. Due to the structure of the Hadamard matrix, these four offspring are four different recombinations of v1 and x1. Compared with the traditional crossover operator, HLS searches the local space more thoroughly and finds a better solution more easily. Therefore, HLS improves the search performance when the classical crossover cannot find a better solution.
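The offspring-generation step described above can be sketched as follows. This is our reading of HLS, not the authors' code: the index range is randomly split into k contiguous subintervals, and each row of the Hadamard matrix selects, subinterval by subinterval, genes from the mutant (entry +1) or the target (entry −1).

```python
import numpy as np

def hls_offspring(x, v, H, rng):
    """Generate one offspring per row of Hadamard matrix H by mixing the
    target x and mutant v over a random contiguous partition of the indices."""
    D = len(x)
    k = H.shape[0]  # order of the Hadamard matrix, e.g. 4
    # Randomly split [0, D) into k contiguous subintervals via k-1 cut points.
    cuts = np.sort(rng.choice(np.arange(1, D), size=k - 1, replace=False))
    groups = np.searchsorted(cuts, np.arange(D), side='right')  # subinterval id per dim
    offspring = []
    for row in H:
        pick_mutant = row[groups] == 1  # +1 -> gene from v, -1 -> gene from x
        offspring.append(np.where(pick_mutant, v, x))
    return np.array(offspring)
```

Because the first row of a (Sylvester) Hadamard matrix is all +1, one of the generated offspring coincides with the mutant itself; the remaining rows yield genuinely mixed vertices of the hyper-rectangle.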

New Framework of DE with HLS.
There are two common ways to use a local search operator in a DE algorithm. One is to replace the original crossover operator with the local search operator, as OXDE [17] does. The other is to select an individual to perform a local search independently during evolution. In essence, the goal of local search is to find offspring better than the target individual; therefore, when the crossover operator can already produce better offspring, there is no need for local search. During evolution, the success rate of individual updates is relatively high in the early stage, while in the late stage it becomes very low and even tends to zero.
In the following, we use three representative functions, Quartic with Noise, Penalized1, and Shifted Ackley, to illustrate the convergence process. In the experiment, the population size was set to 30. Figure 2 shows the successful individual updates when solving these three functions. As can be seen from Figure 2, almost no individual can be successfully updated in the later stage.
To improve the success rate of DE in the later stage, a new framework of DE with HLS is proposed. The HLS operator does not affect the performance of the algorithm in the early stage of evolution, but it can effectively avoid premature convergence in the late stage. The new framework is presented in Algorithm 2. To avoid consuming too many evaluations, the framework invokes HLS with a specified probability p (see Algorithm 2, Step 13); in our experiments, p is set to 0.1. In addition, to make full use of the information of the evolution process, HLS uses the mutation vector to construct the local search (see Algorithm 2, Step 14).
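Putting the pieces together, the overall framework might look as follows. This is a hedged sketch based on the description above (Algorithm 2 itself is not reproduced in this text): DE/rand/1/bin as the base scheme, with HLS invoked with probability p on the current target/mutant pair; all names and default values are ours.

```python
import numpy as np

def de_hls(f, D, NP=30, F=0.5, CR=0.9, p=0.1, gmax=100,
           bounds=(-5.0, 5.0), seed=0):
    """Sketch of DE/rand/1/bin with a Hadamard local search (HLS) step
    applied with probability p. Minimization of f over [lo, hi]^D."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = lo + rng.random((NP, D)) * (hi - lo)           # initialization
    fit = np.array([f(x) for x in pop])
    H4 = np.array([[1, 1, 1, 1], [1, -1, 1, -1],
                   [1, 1, -1, -1], [1, -1, -1, 1]])
    for _ in range(gmax):
        for i in range(NP):
            r1, r2, r3 = rng.choice([k for k in range(NP) if k != i],
                                    size=3, replace=False)
            v = pop[r1] + F * (pop[r2] - pop[r3])        # DE/rand/1 mutation
            mask = rng.random(D) < CR
            mask[rng.integers(D)] = True
            u = np.where(mask, v, pop[i])                # binomial crossover
            cands = [u]
            if rng.random() < p:                         # HLS with probability p
                cuts = np.sort(rng.choice(np.arange(1, D), size=3, replace=False))
                groups = np.searchsorted(cuts, np.arange(D), side='right')
                cands += [np.where(H4[r][groups] == 1, v, pop[i])
                          for r in range(4)]
            for c in cands:                              # greedy selection
                fc = f(c)
                if fc <= fit[i]:
                    pop[i], fit[i] = c, fc
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

Usage: `xbest, fbest = de_hls(lambda x: float(np.sum(x ** 2)), D=10)` minimizes the 10-dimensional sphere function.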

Computational Complexity.
For DE with HLS, the computational complexity is determined by the number of times the three DE operators and HLS are executed, and its execution time is proportional to the dimension of the search space. Consequently, DE with HLS has a worst-case time complexity on the order of O((1 + p) × D × NP × GMAX), where NP is the population size, GMAX is the maximum number of iterations, D is the problem dimension, and p is the user-defined probability of executing HLS. Since p is a constant, the time complexity of DE with HLS reduces to O(D × NP × GMAX). In [9], the time complexity of DE is O(D × NP × GMAX); therefore, the time complexity of DE with HLS is the same as that of DE.

Quality of the HLS.
In this section, four classical DE schemes, namely DE/rand/1, DE/best/1, DE/rand-to-best/1, and DE/current-to-best/1, are used to evaluate the quality of HLS. To distinguish them, these four schemes are named DE1, DE2, DE3, and DE4, respectively. The HLS operator has been integrated into each of the four schemes, yielding DE1HLS, DE2HLS, DE3HLS, and DE4HLS. To guarantee a fair comparison, the same parameters are used for all the algorithms. Table 2 presents the experimental results. "Average error" and "standard error" denote, respectively, the average function error and the standard deviation obtained by each algorithm. The results of the Wilcoxon rank-sum test are marked "-", "+", and "≈" in the table, indicating that the performance of DE without HLS is worse than, better than, or similar to that of DE with HLS. In addition, Figures 3-10 show the evolutionary processes of the competitors.
From the results in Table 2, we can see that HLS greatly improves the performance of the four classical DE schemes. On the 20 benchmark problems, the numbers of functions for which the four DE schemes with HLS obtain better results than the corresponding DE schemes are 16, 15, 13, and 13, respectively. This suggests that HLS improves performance on the majority of test functions. The numbers of functions for which the four DE schemes with HLS obtain worse results are 3, 4, 7, and 7, respectively; these are mainly unimodal functions. One reason is that the solutions of unimodal functions are easy to obtain, whereas HLS increases the number of evaluations. Therefore, we conclude that HLS can effectively improve the performance of DE, especially on multimodal and shifted functions.

Effect of the Size of Hadamard Matrix Dimension.
In this section, the effect of the order of the Hadamard matrix is analysed. Since the order of a Hadamard matrix is a multiple of 2 or 4, to reduce the computational burden, Hadamard matrices of orders 4, 8, and 16 are used in the experiments (written as HLS-4, HLS-8, and HLS-16). As can be seen from the analysis in Section 4.3, DE1HLS (DE/rand/1 + HLS) is the best of the four schemes; thus, DE1HLS is used in this experiment. Table 3 summarizes the results, in which "†" marks the best of the three settings. The statistical results are in the last row of the table.
As can be seen from Table 3, when the order of the Hadamard matrix is 4, the optimal solutions of 16 problems are better than those of the other two orders (8 and 16). Table 4 further gives the average rankings of the three Hadamard matrix orders (based on the Friedman test); HLS-4 has the best average ranking. Thus, the following experiments use HLS-4. The comparison results are presented in Table 5.
Compared with jDE, jDEHLS is superior to jDE on 10 functions and similar to jDE on 10 functions. Compared with SaDE, jDEHLS is superior to SaDE on 10 functions, but inferior to SaDE on 3 functions and similar to SaDE on 7 functions. Compared with ODE, jDEHLS is superior to ODE on 13 functions, but inferior to ODE on 1 function and similar to ODE on 6 functions. Compared with OXDE, jDEHLS is superior to OXDE on 17 functions, but inferior to OXDE on 1 function and similar to OXDE on 2 functions.
In short, HLS can improve the performance of jDE. On 20 test functions, the performance of jDEHLS is the best one among the five methods.
To judge whether the results of the five methods differ in a statistically significant way, the nonparametric Friedman test is conducted. The test results are presented in Table 6. As shown in Table 6, the average ranking values of the five algorithms can be sorted in the following order: jDEHLS, jDE, ODE, SaDE, and OXDE.

Conclusion
This paper presents a new local search operator based on the Hadamard matrix, called HLS. HLS searches the subspace defined by two individuals and improves the probability of finding a better solution in that space by constructing multiple offspring. This is very beneficial for balancing exploration and exploitation. Implementations in four classical DE algorithms and one DE variant, jDE, demonstrate the effectiveness of HLS. In future work, a parameter adaptation mechanism for HLS is expected to be developed. In addition, the use of HLS for large-scale optimization problems will be considered. Finally, the proposed HLS operator may be used to tackle complex real-world optimization problems. The source code of DEHLS is available at https://github.com/gitdxg110/DEHLS_v1.

Data Availability
The data used to support the findings of this study are included within the article.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.