Cat swarm optimization with normal mutation for fast convergence of multimodal functions
Introduction
Optimization is essential to the design, construction and maintenance of engineering systems. Although optimization methods originated in the days of Newton, Lagrange and Cauchy through the formulation of analytical optimization methods, such analytical methods cannot solve all types of complex optimization problems; in particular, they tend to converge to local minima when solving multimodal optimization problems. As the complexity of the system increases, more efficient and less computationally intensive optimization methods are necessary.
Over the last few decades, evolutionary algorithms such as genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], evolutionary programming (EP) [3], differential evolution (DE) [4], ant colony optimization (ACO) [5], invasive weed optimization (IWO) [6], cat swarm optimization (CSO) [[7], [8]] etc. and their variants [[9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25]] have been successfully introduced to solve numerous complex optimization problems.
However, most of these algorithms suffer from the “curse of dimensionality”, which implies that the performance of the algorithm decreases as the dimensionality of the solution landscape increases. As the dimensionality increases, many state-of-the-art evolutionary algorithms get trapped in local optima. This problem is exceptionally severe for multimodal problems with higher dimensions [[26], [27], [28], [29]].
Another issue to be considered while using optimization techniques is the convergence speed. Most algorithms suffer from relatively slow convergence rates, and finding the global optimum quickly is significantly harder for a high-dimensional problem than for a low-dimensional one. It is therefore imperative to develop novel algorithms that achieve greater solution accuracy with a fast convergence rate on higher-dimensional problems.
In order to achieve solution accuracy and a fast convergence rate simultaneously, a variant of the CSO algorithm is proposed: a normal mutation strategy [[30], [31]] is introduced into the CSO algorithm, yielding NMCSO. CSO is a high-performance computational method inspired by the natural behavior of cats. It was introduced by Chu and Tsai in 2007 [7]. It has been applied to different engineering problems [[22], [23], [24], [25]] and has shown good performance compared to many well-known evolutionary algorithms. The introduction of the normal mutation operator enhances the convergence rate and solution accuracy of the CSO algorithm in solving high-dimensional multimodal problems.
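The core idea of a normal mutation strategy can be sketched as a zero-mean Gaussian perturbation of a candidate position. The function name and the step-size parameter `sigma` below are illustrative assumptions for the sketch, not values or notation taken from the paper:

```python
import numpy as np

def normal_mutation(position, sigma=0.1, rng=None):
    """Perturb a candidate position with zero-mean Gaussian noise.

    `sigma` controls the mutation step size (illustrative default);
    `rng` is a NumPy random Generator for reproducibility.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Add independent N(0, sigma^2) noise to every coordinate.
    return position + rng.normal(0.0, sigma, size=position.shape)
```

In practice such a mutation is applied when candidate positions are generated, so that the swarm keeps exploring around promising regions instead of collapsing prematurely.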
The paper is organized as follows. Section 2 presents a detailed description of the proposed algorithm. Section 3 presents the search behavior of NMCSO and classical CSO. A detailed analysis of the optimization of various test functions using the proposed method is provided in Section 4 and lastly, Section 5 highlights the major benefits of the proposed algorithm.
Traditional CSO
Cat swarm optimization is modelled by identifying specific characteristic features of a cat’s behavior, termed the seeking mode and the tracing mode. Thus there are two modes of operation of CSO: the seeking mode and the tracing mode. The cats are distributed between the two modes based on a mixture ratio (MR).
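The mode assignment can be sketched as follows. The helper name `assign_modes` is illustrative, and MR is taken here as the fraction of cats placed in tracing mode (an assumption about the convention, since the snippet does not fix it):

```python
import numpy as np

def assign_modes(num_cats, mixture_ratio, rng=None):
    """Flag each cat as tracing (True) or seeking (False).

    `mixture_ratio` (MR) is interpreted as the fraction of the swarm
    placed in tracing mode; the rest remain in seeking mode.
    """
    rng = np.random.default_rng() if rng is None else rng
    flags = np.zeros(num_cats, dtype=bool)
    n_tracing = int(round(mixture_ratio * num_cats))
    # Pick a random subset of cats (without replacement) for tracing mode.
    tracing_idx = rng.choice(num_cats, size=n_tracing, replace=False)
    flags[tracing_idx] = True
    return flags
```

The assignment is typically redrawn at every generation, so each cat alternates between exploring locally (seeking) and chasing the best-known position (tracing).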
Search behavior of NMCSO and classical CSO
To illustrate the search behavior of NMCSO and classical CSO, two 2-D multimodal functions, namely the Rastrigin function and the rotated Griewank function, are considered. The search behavior is observed by noting the distribution of cats at various stages of the evolutionary process.
The functions used for this illustration are as follows:
f1(x) = Σ_{i=1}^{2} (x_i² − 10 cos(2π x_i) + 10),
f2(x) = 1 + (1/4000) Σ_{i=1}^{2} y_i² − Π_{i=1}^{2} cos(y_i/√i), with y = Ox,
where O is an orthogonal matrix. The first function is an unrotated multimodal function, while the second is a rotated multimodal function.
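For reference, the two test functions can be evaluated with a short sketch using the standard textbook definitions of Rastrigin and Griewank; the orthogonal rotation matrix `O` is passed in explicitly:

```python
import numpy as np

def rastrigin(x):
    """Rastrigin: f(x) = sum(x_i^2 - 10*cos(2*pi*x_i) + 10), minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def rotated_griewank(x, O):
    """Griewank evaluated on the rotated input y = O x, minimum 0 at the origin."""
    y = np.asarray(O, dtype=float) @ np.asarray(x, dtype=float)
    i = np.arange(1, y.size + 1)
    return float(1.0 + np.sum(y**2) / 4000.0 - np.prod(np.cos(y / np.sqrt(i))))
```

Both functions have many local minima surrounding a single global minimum at the origin, which is why they are standard choices for visualizing how a swarm spreads and contracts during the evolutionary process.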
Test functions
In this communication, NMCSO is applied to minimize a set of sixteen benchmark problems which are listed in Table 1 [[18], [35]]. These benchmark functions are widely adopted for evaluating the performance of global optimization algorithms. They are divided into three groups according to their properties: unimodal, unrotated multimodal and rotated multimodal problems. The properties and the formulas of these benchmark functions are given in Table 1.
Conclusions
In this paper, we have proposed a normal mutation strategy based CSO to solve complex multimodal problems. The new strategy guides the cats to seek better positions in an efficient and promising way. It also improves the global search ability and convergence characteristics of the CSO algorithm. It must be noted that we have not introduced any complex variations to the original CSO structure; the only difference is the position update equation in the seeking mode.
Acknowledgement
This work was supported by Indian Institute of Technology, Bhubaneswar and MHRD, India.
References (39)
- et al., A novel numerical optimization algorithm inspired from weed colonization, Ecol. Inf. (2006)
- et al., An improved particle swarm optimizer based on tabu detecting and local learning strategy in a shrunk search space, Appl. Soft Comput. (2014)
- et al., IIR system identification using cat swarm optimization, Expert Syst. Appl. (2011)
- et al., Enhanced parallel cat swarm optimization based on the Taguchi method, Expert Syst. Appl. (2012)
- et al., Optimizing least-significant-bit substitution using cat swarm optimization strategy, Inf. Sci. (2012)
- et al., Linear antenna array synthesis using cat swarm optimization, Int. J. Electron. Commun. (AEU) (2014)
- et al., A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms, Swarm Evolut. Comput. (2011)
- Genetic Algorithms in Search, Optimization, and Machine Learning (1989)
- et al., Particle swarm optimization, Proc. IEEE Int. Conf. Neural Netw. (1995)
- et al., Artificial Intelligence Through Simulated Evolution (1966)