Benchmarking RCGAu on the Noiseless BBOB Testbed

RCGAu is a hybrid real-coded genetic algorithm with a “uniform random direction” search mechanism that enhances the local search capability of the RCGA. In this paper, RCGAu was tested on the BBOB-2013 noiseless testbed using independent restarts until a maximum number of function evaluations (#FEs) of 10^5 × D was reached, where D is the dimension of the function search space. RCGAu was able to solve several test functions in the low search dimensions of 2 and 3 to the desired accuracy of 10^−8. Although RCGAu had difficulty reaching the desired accuracy of 10^−8 for high-conditioning and multimodal functions within the specified maximum #FEs, it was able to solve most of the test functions with dimensions up to 40 at lower precisions.


Introduction
The simple genetic algorithm (GA) introduced by Holland is a probabilistic algorithm based on the theory of natural selection by Charles Darwin. GA mimics the evolutionary process through the creation of variations in each generation and the survival of the fittest individuals through the blending of genetic traits. Individuals with genetic traits that increase their probability of survival are given more opportunities to reproduce, and their offspring also profit from the heritable traits. Over time, these individuals eventually dominate the population [1, 2].
GA consists of a set of potential solutions called chromosomes, a selection operator, a crossover operator, and a mutation operator. A chromosome is a string of zeros (0s) and ones (1s). It is a metaphor of the biological chromosome in living organisms. The zeros (0s) and ones (1s) are called genes. A gene is the transfer unit of heredity. It contains genetic traits or information that is passed on from a parent solution to its offspring. The selection operator selects solutions for mating based on the principle of "survival of the fittest." The crossover operator generates new solution pairs called children by combining the genetic materials of the selected parents. The mutation operator is an exploratory operator that is applied, with low probability, to the population of chromosomes to sustain diversity. Without the mutation operator, GAs can easily fall into premature convergence [1, 3].
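The operator loop described above can be sketched as a minimal binary GA. This is an illustrative toy, not the paper's RCGAu: the OneMax fitness function, fitness-proportionate selection, population size, and rates below are assumptions chosen for demonstration only.

```python
import random

def roulette(pop, fits):
    """Fitness-proportionate ("survival of the fittest") selection."""
    r = random.uniform(0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def one_point_crossover(a, b):
    """Swap the tails of two parent bit strings at a random cut point."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def bit_flip(ind, p_m=0.05):
    """Mutation: flip each gene independently with low probability p_m."""
    return [1 - g if random.random() < p_m else g for g in ind]

def simple_ga(n_bits=20, pop_size=30, generations=100, seed=1):
    random.seed(seed)
    fitness = lambda ind: sum(ind)  # OneMax: number of 1-genes
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        nxt = []
        while len(nxt) < pop_size:
            c1, c2 = one_point_crossover(roulette(pop, fits),
                                         roulette(pop, fits))
            nxt += [bit_flip(c1), bit_flip(c2)]
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

best = simple_ga()
```

Without the bit-flip mutation, the loop above would quickly lose the diversity that crossover needs, which is the premature-convergence risk the text mentions.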
The simple GA was designed to work on binary strings, and it is directly applicable to pseudo-boolean objective functions. However, most real-life problems are represented as continuous parameter optimization problems. A decoding function was designed to map the solutions from the binary space to the real-valued space. This decoding process can become prohibitively expensive for binary string GAs, especially when the problem dimension increases [1, 3]. To tackle this problem, real-coded genetic algorithms were introduced [4].
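As an illustration of the decoding step, a bit string can be mapped to a real value in a given interval by interpreting it as an unsigned integer and rescaling. This is one standard scheme; the 10-bit length and the bounds below are assumptions for the example.

```python
def decode(bits, lower, upper):
    """Map a binary chromosome to a real value in [lower, upper] by
    interpreting the bits as an unsigned integer and rescaling."""
    as_int = int("".join(str(b) for b in bits), 2)
    max_int = 2 ** len(bits) - 1
    return lower + (upper - lower) * as_int / max_int

# a 10-bit gene segment decoded into the interval [-5, 5]
x = decode([1, 0, 1, 1, 0, 0, 1, 0, 1, 0], -5.0, 5.0)
```

Note that every decoded coordinate of a D-dimensional solution needs its own bit segment, which is why the cost grows with the problem dimension.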
Real-coded genetic algorithms (RCGAs) use real-valued vectors to represent individual solutions. Surveys show that several variants of RCGAs have been proposed and used to solve a wide range of real-life optimization problems. Some recent examples can be found in [1, 4–10].
In this paper, the noiseless testbed from the black-box optimization benchmarking (BBOB) 2013 workshop is used to benchmark RCGAu, a hybrid real-coded genetic algorithm that incorporates a "uniform random direction" local search technique.
The RCGAu algorithm is presented in Section 2; Section 3 provides the CPU timing for the experiments; Section 4 presents the results and discussion; finally, Section 5 concludes the paper with some recommendations.

The RCGAu Algorithm
RCGAu is a hybrid RCGA with a simple derivative-free local search technique called the "uniform random direction" method. The local search technique operates on all individuals after the mutation operator has been applied to the population.
The RCGAu used in this work is a modified version of the RCGAu used in [16,17]. It consists of five major operators, namely, tournament selection, blend-crossover, nonuniform mutation, uniform random direction local search method, and a stagnation alleviation mechanism. Algorithm 1 shows the RCGAu algorithm.
The notations used in this paper are defined as follows: P(t) denotes the population of individual solutions x_i(t) at time t; N is the size of P(t); σ(f(P(t))) represents the standard deviation of the fitness values f(x_i(t)) of all solutions x_i(t) ∈ P(t); P̂(t) is the mating pool containing the parent solutions; C(t) is the population of offspring solutions obtained after applying crossover on the parents in P̂(t); p_c is the crossover probability; M(t) is the resultant population of solutions after applying mutation on C(t); p_m is the mutation probability; and Υ(t) is the population of solutions obtained after ulsearch has been applied to M(t), where ulsearch denotes the uniform random direction local search. Also, ε = 10^−12, a very small positive value [18].
The evolutionary process in Algorithm 1 starts by initializing P(0) from the search space S ⊂ R^D. The domain of S is defined by specifying upper (u_k) and lower (l_k) limits of each kth component of x; that is, l_k ≤ x_k ≤ u_k with l_k, u_k ∈ R, k = 1, 2, . . . , D. Next, the fitness value f(x_i), for all x_i ∈ P(0), is calculated, and the population diversity of P(0) is measured by computing the standard deviation σ(f(P(0))) of the fitness values.
If σ(f(P(t))) ≤ ε and the global optimum has not been found, then 90% of P(t) is refreshed with newly generated solutions using the function perturb(P(t)). P(t) is refreshed by sorting the solutions according to their fitness values and preserving the top 10% of P(t). The remaining 90% of P(t) are replaced with uniformly generated random values from the interval [−4, 4], and the resultant mating pool P̂(t) = {x_1(t), x_2(t), . . . , x_M(t)} is created, where M is the size of the mating pool and M ≤ N. If, on the other hand, σ(f(P(t))) > ε, then tournament selection is applied on P(t) to create an equivalent mating pool P̂(t).
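The diversity-triggered refresh can be sketched as below. This is a minimal sketch assuming minimization and NumPy arrays; the function name `perturb` follows the text, while the 10% elite fraction and the [−4, 4] resampling interval are taken from the description above.

```python
import numpy as np

def perturb(pop, fitness_vals, low=-4.0, high=4.0, keep_frac=0.1, rng=None):
    """Refresh a stagnant population (minimization assumed): keep the best
    10% of solutions and resample the remaining 90% uniformly from
    [low, high] in every coordinate."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    order = np.argsort(fitness_vals)            # best (smallest) first
    n_keep = max(1, int(keep_frac * n))
    elites = pop[order[:n_keep]]
    fresh = rng.uniform(low, high, size=(n - n_keep, d))
    return np.vstack([elites, fresh])

# example: 20 solutions in 5 dimensions, fitness equal to the row index
rng = np.random.default_rng(0)
pop = rng.uniform(-4.0, 4.0, size=(20, 5))
refreshed = perturb(pop, np.arange(20.0), rng=rng)
```

Keeping a small elite before resampling preserves the best information found so far while restoring the diversity that the ε-threshold test detected as lost.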
The tournament selection scheme works by selecting q solutions uniformly at random from P(t), where q is the tournament size and q < N. The selected individuals are compared using their fitness values, and the best individual is selected and assigned to P̂(t). This procedure is repeated M times to populate P̂(t).
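The tournament procedure above can be sketched as follows (minimization assumed; the function and variable names are illustrative, not from the paper):

```python
import random

def tournament_pool(pop, fitness, pool_size, q=3):
    """Fill a mating pool by repeated q-way tournaments: sample q distinct
    indices uniformly at random and keep the individual with the best
    (smallest) fitness; repeat pool_size times."""
    pool = []
    for _ in range(pool_size):
        contenders = random.sample(range(len(pop)), q)
        winner = min(contenders, key=lambda i: fitness[i])
        pool.append(pop[winner])
    return pool

# four solutions whose fitness equals their single coordinate
pop = [[0.0], [1.0], [2.0], [3.0]]
pool = tournament_pool(pop, fitness=[0.0, 1.0, 2.0, 3.0], pool_size=5)
```

With q = 3 out of 4 candidates, the two worst solutions can never win a tournament, which shows how the tournament size q controls selection pressure.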
With probability p_c, the blend crossover [4] is applied to each pair of parents selected from P̂(t); the resulting offspring pair (y_1, y_2) is then copied to the set C(t); otherwise the parent pair (x_1, x_2) is copied to C(t).
Then the nonuniform mutation [4] is applied to the components x_k of each member of C(t) with probability p_m as follows:
x'_k = x_k + Δ(t, u_k − x_k) if τ = 0,
x'_k = x_k − Δ(t, x_k − l_k) if τ = 1, (2)
where τ is a random binary digit and u_k and l_k are the upper and lower boundaries of x_k ∈ x, respectively. The function Δ(t, y) given below takes a value in the interval [0, y]:
Δ(t, y) = y (1 − u^((1 − t/T)^b)), (3)
where u is a uniformly distributed random number in the interval [0, 1], T is the maximum number of generations, and b is a parameter that determines the nonuniform strength of the mutation operator. The mutated individual x'_i is then copied to the set M(t); otherwise x_i is copied to M(t).
Then ulsearch is applied on each solution x_i ∈ M(t) with the aim of performing local searches around the neighborhood of each solution. ulsearch works by randomly selecting a solution x_j ∈ M(t) and creating a trial point y_i using
y_i = x_i + Δ_t d, (4)
where Δ_t is a step size parameter and d = (d_1, d_2, . . . , d_D) is a vector of directional cosines with random components
d_k = z_k / ‖z‖, z_k ∼ Unif([−1, 1]). (5)
There are cases when components of the trial point y_i = (y_{1,i}, y_{2,i}, . . . , y_{D,i}) generated by (4) fall outside the search space during the search. In these cases, the offending components of y_i are regenerated using
y_{k,i} = x_{k,j} + w (x_{k,i} − x_{k,j}), (6)
where w ∼ Unif([0, 1]) and x_{k,j} is the corresponding component of the randomly selected solution x_j ∈ M(t). The step size parameter Δ_t is initialized at time t = 0 according to [15, 16] using a fraction λ ∈ [0, 1] of the extent of the search domain. The idea behind this initialization is to accelerate the search by starting with a suitably large step size so as to quickly traverse the search space; as the search progresses, the step size is adaptively adjusted at the end of each generation t using the m Euclidean distances {ρ_1, ρ_2, . . . , ρ_m} between the points nearest to the mean of a set of randomly selected distinct points Ω = {ω_1, ω_2, . . . , ω_m} ⊂ P(t). After the trial point y_i ∈ Υ(t) has been created, it is evaluated and compared with x_i.
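The nonuniform mutation operator (Michalewicz [4]) can be sketched as below; the shrinking perturbation Delta(t, y) = y * (1 - u**((1 - t/T)**b)) is implemented directly, and the default rates mirror the experimental settings reported later in the paper. Function and variable names are illustrative.

```python
import random

def nonuniform_mutation(x, t, T, lower, upper, p_m=0.15, b=15):
    """Michalewicz-style nonuniform mutation: each component is perturbed
    with probability p_m by Delta(t, y) = y * (1 - u**((1 - t/T)**b)),
    so steps shrink toward zero as generation t approaches the maximum T,
    shifting the operator from exploration to fine-tuning."""
    def delta(y):
        u = random.random()
        return y * (1.0 - u ** ((1.0 - t / T) ** b))
    child = list(x)
    for k in range(len(x)):
        if random.random() < p_m:
            if random.random() < 0.5:           # the random binary digit
                child[k] = x[k] + delta(upper[k] - x[k])
            else:
                child[k] = x[k] - delta(x[k] - lower[k])
    return child
```

Because Delta(t, y) never exceeds y, the mutated component always stays inside [l_k, u_k], and at t = T the perturbation vanishes entirely.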
If f(y_i) < f(x_i), then y_i ∈ Υ(t) is used to replace x_i ∈ M(t); otherwise the search direction is changed by changing the sign of the step length. The new step length is used to recalculate a new trial point. After the new trial point has been recalculated and evaluated, it replaces x_i ∈ M(t) if its fitness is smaller; otherwise x_i ∈ M(t) is retained.
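The probe-and-reverse step of ulsearch can be sketched as below. This is a simplified sketch assuming minimization: out-of-bounds components are clipped here, rather than regenerated from a randomly selected solution as in the paper, and the adaptive step-length update is left out.

```python
import numpy as np

def ulsearch_step(x, f, step, low=-4.0, high=4.0, rng=None):
    """One uniform-random-direction probe around x (minimization): try a
    trial point x + step*d along a random unit direction d; if it is not
    better, reverse the direction (flip the sign of the step) and try
    once more; keep x if both probes fail."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.uniform(-1.0, 1.0, size=x.shape)
    d = z / np.linalg.norm(z)                  # directional cosines
    for s in (step, -step):
        trial = np.clip(x + s * d, low, high)  # clip instead of the paper's
                                               # component regeneration rule
        if f(trial) < f(x):
            return trial
    return x

sphere = lambda v: float(np.sum(v ** 2))
x0 = np.array([1.0, 1.0])
x1 = ulsearch_step(x0, sphere, step=0.5, rng=np.random.default_rng(0))
```

The step never worsens the incumbent: either one of the two probes improves the fitness, or the original point is returned unchanged.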
At the end of ulsearch, P(t) is updated with Υ(t) to form P(t + 1), and elitism is used to replace the worst point in P(t + 1) with the best point in P(t), because the generational model is the replacement strategy adopted in this work [19].

Experimental Procedure and Parameter Settings
The experimental setup was carried out according to [20] on the benchmark functions provided in [21, 22]. Two independent restart strategies were used to restart RCGAu whenever it stagnated or when the maximum number of generations was exceeded without the target being found. For each restart, the experiment is reinitialized with an initial population P(0) which is uniformly and randomly sampled from the search space [−4, 4]^D [6, 18].
Two stopping conditions used for the restart strategies are as follows.
(i) A test for stagnation is carried out to check whether the best solution obtained so far has not varied by more than 10^−12 during the last (50 + 25 × D) generations, as in [6].
(ii) The maximum number of function evaluations, #FEs = 10^5 × D, is exceeded.
The remaining parameter settings were as follows: (iii) tournament size q = 3; (iv) crossover rate p_c = 0.8; (v) mutation rate p_m = 0.15; (vi) nonuniformity factor for the mutation b = 15; (vii) elitism = 1; (viii) crafting effort CrE = 0 [20].
[Figure 3: ERT loss ratio plots per function subgroup (separable functions; weakly structured functions; see Table 2 for details). Each cross (+) represents a single function and the line is the geometric mean.]
Table 2: ERT loss ratio versus the budget (both in number of f-evaluations divided by dimension). The target value f_t for a given budget FEvals is the best target f-value reached within the budget by the given algorithm. Shown is the ERT of the given algorithm divided by the best ERT seen in GECCO-BBOB-2009 for the target f_t, or, if the best algorithm reached a better target within the budget, the budget divided by the best ERT. Line: geometric mean. Box-whisker error bar: 25-75%-ile with median (box), 10-90%-ile (caps), and minimum and maximum ERT loss ratio (points). The vertical line gives the maximal number of function evaluations in a single trial in this function subset. See also Figure 3 for results on each function subgroup.

CPU Timing Experiment
The CPU timing experiment was conducted for RCGAu using the same independent restart strategies on the function f8 for a duration of 30 seconds on an AMD Turion(tm) II Ultra Dual-Core Mobile 620 processor, running at 2.50 GHz under 32-bit Microsoft Windows 7 Professional Service Pack 1 with 2.75 GB usable RAM and Matlab 7.10 (R2010a).

Results
The results of the empirical experiments conducted on RCGAu according to [20] on the benchmark functions given in [21, 22] are presented in Figures 1, 2, and 3 and in Tables 1 and 2. Figure 1 shows the performance of RCGAu on all the noiseless problems with dimensions 2, 3, 5, 10, 20, and 40. RCGAu was able to solve many test functions in the low search dimensions of 2 and 3 to the desired accuracy of 10^−8. It was able to solve most test functions with dimensions up to 40 at the lowest precision of 10^1.
Although RCGAu had difficulty obtaining a solution with the desired accuracy of 10^−8 for high-conditioning and multimodal functions within the specified maximum #FEs, it was able to solve f21 with dimensions up to 40, f1 and f2 with dimensions up to 20, f3 and f7 with dimensions up to 10, and f4, f6, f15, f20, and f22 with dimensions up to 5.
In Figure 2, the left subplot graphically illustrates the empirical cumulative distribution function (ECDF) of the number of function evaluations divided by the dimension of the search space, while the right subplot shows the ECDF of the best achieved Δf precision. This figure graphically shows the performance of RCGAu in terms of function evaluations. Table 1 presents the performance of RCGAu in terms of the expected running time (ERT), which estimates the run time as a number of function evaluations; the reported values are the ERT of RCGAu divided by the best ERT measured during the BBOB-2009 workshop. This benchmark shows that RCGAu needs some improvement in terms of performance.

Conclusion
The performance of RCGAu on the suite of noiseless black-box optimization testbed functions has been average on a number of problems, but it excelled in solving functions f1, f2, f3, f7, and f21. Studies are currently being carried out to find out why RCGAs do not efficiently solve highly conditioned problems. Further modifications to RCGAs are needed to exploit the full strength of evolutionary processes.