Applied Soft Computing

Volume 66, May 2018, Pages 473-491

Cat swarm optimization with normal mutation for fast convergence of multimodal functions

https://doi.org/10.1016/j.asoc.2018.02.012

Highlights

  • A novel normal mutation strategy based cat swarm optimization is proposed.

  • Twenty complex test functions are used to evaluate the accuracy of the proposed method.

  • NMCSO attains the global optimum for most of the multimodal problems with a faster convergence rate.

  • The numerical results illustrate that NMCSO is quite superior to several state-of-the-art evolutionary algorithms.

  • The proposed method also works well for higher dimensional problems.

Abstract

A normal mutation strategy based cat swarm optimization (NMCSO) that features effective global search capabilities with accelerated convergence speed is presented. The classical CSO suffers from premature convergence and gets easily trapped in local optima because of its random mutation process. This frailty has restricted a wider range of applications of the classical CSO. To overcome these drawbacks, normal mutation is adopted in the mutation process in this paper. It enables the cats to seek positions in better directions, avoiding the problems of premature convergence and local optima. Experiments are conducted on several benchmark unimodal, rotated, unrotated and shifted multimodal problems to demonstrate the effectiveness of the proposed method. Furthermore, NMCSO is also applied to solve large parameter optimization problems. The experimental results illustrate that the proposed method is quite superior to classical CSO, particle swarm optimization (PSO) and several state-of-the-art evolutionary algorithms in terms of convergence speed, global optimality, solution accuracy and algorithm reliability.

Introduction

Optimization is essential for designing, constructing and maintaining various engineering systems. Although optimization originated in the days of Newton, Lagrange and Cauchy through the formulation of analytical optimization methods, such analytical methods cannot solve all types of complex optimization problems; in particular, they tend to converge to local minima when solving multimodal optimization problems. As the complexity of the system increases, more efficient and less computationally intensive optimization methods become necessary.

Over the last few decades, evolutionary algorithms such as genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], evolutionary programming (EP) [3], differential evolution (DE) [4], ant colony optimization (ACO) [5], invasive weed optimization (IWO) [6], cat swarm optimization (CSO) [7,8] etc. and their variants [9–25] have been successfully introduced to solve numerous complex optimization problems.

However, most of these algorithms suffer from the curse of dimensionality: their performance degrades as the dimensionality of the solution landscape increases. As the dimensionality grows, many state-of-the-art evolutionary algorithms get trapped in local optima. This problem is exceptionally severe for multimodal problems in higher dimensions [26–29].

Another issue to be considered when using optimization techniques is the convergence speed. Most algorithms suffer from relatively slow convergence rates, and it is significantly harder to find the global optimum with fast convergence for a high dimensional problem than for a low dimensional one. It is therefore imperative to develop novel algorithms that achieve greater solution accuracy with a fast convergence rate on higher dimensional problems.

In order to achieve solution accuracy and a fast convergence rate simultaneously, a variant of the CSO algorithm is proposed: a normal mutation strategy [30,31] is introduced into the CSO algorithm, yielding NMCSO. CSO is a high performance computational method inspired by the natural behavior of cats; it was introduced by Chu and Tsai in 2007 [7]. It has been applied to various engineering problems [22–25] and has shown good performance compared with many well-known evolutionary algorithms. The introduction of the normal mutation operator enhances the convergence rate and solution accuracy of the CSO algorithm in solving high dimensional multimodal problems.
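To make the contrast concrete, the Python sketch below compares a classical-CSO-style uniform perturbation of a cat's position with a normal (Gaussian) perturbation. This is only an illustrative sketch: the scale parameters srd and sigma are assumptions, and it does not reproduce the paper's exact seeking-mode update equation.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_mutation(position, srd=0.2):
    # Classical-CSO-style mutation (illustrative): each dimension is perturbed
    # by a uniformly distributed step within +/- SRD of its current value.
    step = srd * np.abs(position) * rng.uniform(-1.0, 1.0, size=position.shape)
    return position + step

def normal_mutation(position, sigma=0.2):
    # Normal mutation (illustrative): each dimension is perturbed by a zero-mean
    # Gaussian step, concentrating candidates near the parent while still
    # allowing occasional larger exploratory jumps.
    step = sigma * np.abs(position) * rng.normal(size=position.shape)
    return position + step

cat = np.array([1.5, -2.0])
print(uniform_mutation(cat), normal_mutation(cat))
```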

The paper is organized as follows. Section 2 presents a detailed description of the proposed algorithm, and Section 3 examines the search behavior of NMCSO and classical CSO. A detailed analysis of the optimization of various test functions using the proposed method is provided in Section 4 and, lastly, Section 5 highlights the major benefits of the proposed algorithm.

Section snippets

Traditional CSO

Cat swarm optimization is modelled by identifying specific characteristic features of a cat's behavior, termed the seeking mode and the tracing mode. The CSO therefore operates in two modes: the seeking mode and the tracing mode. The cats are distributed between the two modes based on the mixture ratio (MR), as sketched below.
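A condensed Python sketch of this two-mode structure follows. The parameter names (SMP, SRD, CDC, MR) follow common CSO descriptions, and the update rules are simplified for illustration; they are assumptions rather than the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def cso_minimize(f, dim, bounds, n_cats=30, mr=0.2, n_iter=200,
                 smp=5, srd=0.2, cdc=0.8, c1=2.0):
    # Simplified classical CSO: each iteration, a fraction MR of the cats
    # run in tracing mode and the rest run in seeking mode.
    lo, hi = bounds
    cats = rng.uniform(lo, hi, size=(n_cats, dim))
    vel = np.zeros((n_cats, dim))
    fitness = np.apply_along_axis(f, 1, cats)
    best_idx = int(np.argmin(fitness))
    best, best_fit = cats[best_idx].copy(), float(fitness[best_idx])

    for _ in range(n_iter):
        tracing = rng.random(n_cats) < mr
        for i in range(n_cats):
            if tracing[i]:
                # Tracing mode: PSO-like move towards the best cat found so far.
                vel[i] += c1 * rng.random(dim) * (best - cats[i])
                cats[i] = np.clip(cats[i] + vel[i], lo, hi)
            else:
                # Seeking mode: make SMP copies, perturb a fraction CDC of the
                # dimensions by +/- SRD of their value, and keep the best copy.
                copies = np.repeat(cats[i][None, :], smp, axis=0)
                mask = rng.random((smp, dim)) < cdc
                signs = rng.choice([-1.0, 1.0], size=(smp, dim))
                copies = np.clip(copies + mask * signs * srd * copies, lo, hi)
                cats[i] = copies[int(np.argmin(np.apply_along_axis(f, 1, copies)))]
            fitness[i] = f(cats[i])
        cur = int(np.argmin(fitness))
        if fitness[cur] < best_fit:
            best, best_fit = cats[cur].copy(), float(fitness[cur])
    return best, best_fit

# Example: minimize the 2-D sphere function on [-5, 5]^2.
sol, val = cso_minimize(lambda x: float(np.sum(x**2)), dim=2, bounds=(-5.0, 5.0))
```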

Search behavior of NMCSO and classical CSO

To illustrate the search behavior of NMCSO and classical CSO, two 2-D multimodal functions, namely Rastrigin (fR(x)) and rotated Griewank (fG(x)), are considered. The search behavior is observed by noting the distribution of cats at various stages of the evolutionary process.

The functions used for this illustration are as follows:

$f_R(x)=\sum_{i=1}^{2}\left(x_i^{2}-10\cos(2\pi x_i)+10\right),\quad x_i\in[-5.12,\,5.12]$

$f_G(x)=\sum_{i=1}^{2}\frac{y_i^{2}}{4000}-\prod_{i=1}^{2}\cos\!\left(\frac{y_i}{\sqrt{i}}\right)+1,\quad y=O*x,\quad x_i\in[-600,\,600]$

where O is an orthogonal matrix. The first function is an unrotated multimodal function, while the second is a rotated multimodal function.
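For reference, the two 2-D test functions can be written directly from the formulas above. The particular orthogonal matrix O used in this sketch is an arbitrary rotation chosen for illustration, since the matrix used by the authors is not stated in this snippet.

```python
import numpy as np

def rastrigin_2d(x):
    # Unrotated 2-D Rastrigin, x_i in [-5.12, 5.12]; global minimum 0 at x = 0.
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

# An arbitrary 30-degree rotation stands in for the orthogonal matrix O.
theta = np.pi / 6.0
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def rotated_griewank_2d(x):
    # Rotated 2-D Griewank, x_i in [-600, 600], with y = O @ x.
    y = O @ np.asarray(x, dtype=float)
    i = np.arange(1, 3)
    return float(np.sum(y**2) / 4000.0 - np.prod(np.cos(y / np.sqrt(i))) + 1.0)
```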

Test functions

In this communication, NMCSO is applied to minimize a set of sixteen benchmark problems, listed in Table 1 [18,35]. These benchmark functions are widely adopted for evaluating the performance of global optimization algorithms. They are divided into three groups according to their properties: unimodal, unrotated multimodal and rotated multimodal problems. The properties and formulas of these benchmark functions are given in Table 1.
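As an aside on how rotated and shifted variants are typically constructed, the wrapper below builds a transformed version of a base function under the common convention y = O(x - s). The names shift and ortho are illustrative assumptions; the actual transformation parameters used in the paper are those listed in Table 1.

```python
import numpy as np

def make_shifted_rotated(g, shift, ortho):
    # Wrap a base benchmark g so the optimizer sees g(O @ (x - shift));
    # shift (vector) and ortho (orthogonal matrix) are assumed test parameters.
    def wrapped(x):
        return g(ortho @ (np.asarray(x, dtype=float) - shift))
    return wrapped
```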

Conclusions

In this paper, we have proposed a normal mutation strategy based CSO to solve complex multimodal problems. The new strategy guides the cats to seek better positions in the most efficient and promising way. It also allows the CSO algorithm to exercise its global search abilities and convergence characteristics to a greater extent. It must be noted that we have not introduced any complex variations to the original CSO structure; the only difference is the position update equation in the seeking mode.

Acknowledgement

This work was supported by Indian Institute of Technology, Bhubaneswar and MHRD, India.

References (39)

  • R. Storn et al., Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces, J. Glob. Optim. (1995)

  • M. Dorigo et al., Ant system: optimization by a colony of cooperating agents, IEEE Trans. Syst. Man Cybern. Part B (1996)

  • Shu-Chuan Chu et al. (2006)

  • Shu-Chuan Chu et al., Computational intelligence based on the behavior of cats, Int. J. Innov. Comput. Inf. Control (2007)

  • Y.-W. Leung et al., An orthogonal genetic algorithm with quantization for global numerical optimization, IEEE Trans. Evol. Comput. (2001)

  • J. Kennedy et al., Swarm Intelligence (2001)

  • F. van den Bergh et al., A cooperative approach to particle swarm optimization, IEEE Trans. Evol. Comput. (2004)

  • J.J. Liang et al., Comprehensive learning particle swarm optimizer for global optimization of multimodal functions, IEEE Trans. Evol. Comput. (2006)

  • Zhi-Hui Zhan et al., Orthogonal learning particle swarm optimization, IEEE Trans. Evol. Comput. (2011)