
A heuristic optimization method inspired by wolf preying behavior

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Optimization problems can become intractable when the search space grows enormously. Heuristic optimization methods have therefore been developed to search very large spaces of candidate solutions. These methods, also called metaheuristics, are general algorithmic skeletons that can be modified and extended to suit a wide range of optimization problems. Researchers have devised a collection of metaheuristics inspired by the movements of animals and insects (e.g., the firefly, cuckoo search, bat, and accelerated PSO algorithms), which offer efficient computation and easy implementation. This paper studies a relatively new bio-inspired heuristic optimization algorithm, the Wolf Search Algorithm (WSA), which imitates the way wolves search for food and survive by avoiding their enemies. The WSA is tested quantitatively with different parameter values and compared with other metaheuristic algorithms on a range of popular non-convex functions used as performance test problems for optimization; superior results are observed in most tests.
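
The full algorithm appears in the body of the paper, so what follows is only a minimal Python sketch of the behavior the abstract describes: wolves move toward better positions sighted within a visual radius, wander locally otherwise, and occasionally make a long escape jump as if fleeing an enemy. All names and parameters here (wolf_search, visual radius r, step size step, escape probability pa) are illustrative assumptions, not the authors' published pseudocode or settings.

```python
import math
import random

def wolf_search(f, dim, bounds, n_wolves=20, r=0.5, step=0.2,
                pa=0.25, iters=1000):
    """Minimize f over [lo, hi]^dim with a WSA-style search (sketch only)."""
    lo, hi = bounds
    clamp = lambda v: min(max(v, lo), hi)
    wolves = [[random.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    for _ in range(iters):
        for i, w in enumerate(wolves):
            # Prey-seeking: move toward the best better-scoring wolf
            # visible within the assumed visual radius r.
            visible = [v for v in wolves
                       if v is not w and dist(v, w) < r and f(v) < f(w)]
            if visible:
                target = min(visible, key=f)
                cand = [clamp(x + step * (t - x)) for x, t in zip(w, target)]
            else:
                # Nothing better in sight: Brownian-style local wander.
                cand = [clamp(x + random.gauss(0, step)) for x in w]
            if f(cand) < f(w):  # greedy acceptance of improving moves
                wolves[i] = cand
            # Enemy avoidance: with probability pa, escape by jumping
            # a longer random distance regardless of fitness.
            if random.random() < pa:
                jump = 0.1 * (hi - lo)
                wolves[i] = [clamp(x + random.uniform(-jump, jump))
                             for x in wolves[i]]
    return min(wolves, key=f)

# Example: minimize a 10-D sphere function.
best = wolf_search(lambda x: sum(v * v for v in x), dim=10,
                   bounds=(-5.12, 5.12))
print(best)
```

Greedy acceptance plus an unconditional escape jump is one plausible reading of "search for food and survive by avoiding their enemies"; the published WSA may define the step sizes and escape behavior differently.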



Acknowledgments

The authors are thankful for the financial support from the research grant “Adaptive OVFDT with Incremental Pruning and ROC Corrective Learning for Data Stream Mining,” Grant no. MYRG073(Y3-L2)-FST12-FCC, offered by the University of Macau, FST and RDAO.

Author information

Corresponding author

Correspondence to Simon Fong.

Appendix: fitness functions

We used the following benchmark functions to test our algorithms; a Python transcription of all eight functions appears after the list.

  1. Griewank's function: this test function has a large number of local minima,

    $$f(x) = \sum\limits_{i = 1}^{n} {\frac{x_{i}^{2}}{4000}} - \prod\limits_{i = 1}^{n} {\cos \left( \frac{x_{i}}{\sqrt{i}} \right)} + 1$$

    The global minimum is f_min = 0 at (0, …, 0), where −600 < x_i < 600.

  2. Sphere function:

    $$f(x) = \sum\limits_{i = 1}^{d} {x_{i}^{2}}, \quad {\text{where}}\; - 5.12 < x_{i} < 5.12$$

    The global minimum is f_min = 0 at (0, …, 0).

  3. Rastrigin's function: this function is difficult because of its large search space and large number of local minima.

    $$f(x) = An + \sum\limits_{i = 1}^{n} {\left[ x_{i}^{2} - A\cos (2\pi x_{i}) \right]}$$

    where A = 10, x_i ∊ [−5.12, 5.12], and the global minimum is f_min = 0 at (0, …, 0).

  4. Schaffer's F6 function:

    $$f(x, y) = 0.5 + \frac{\sin^{2} \left( \sqrt{x^{2} + y^{2}} \right) - 0.5}{\left( 1 + 0.001 (x^{2} + y^{2}) \right)^{2}}$$

    where x, y ∊ [−10, 10] and the global minimum is f_min = 0 at (0, 0).

  5. Moved axis parallel hyper-ellipsoid function:

    $$f(x) = \sum\limits_{i = 1}^{n} {5i \cdot x_{i}^{2}}$$

    where x_i ∊ [−5.12, 5.12] and the global minimum is f_min = 0 at (0, …, 0).

  6. The third Bohachevsky function:

    $$f(x) = x_{1}^{2} + 2x_{2}^{2} - 0.3\cos (3\pi x_{1}) + 0.3\cos (4\pi x_{2}) + 0.3$$

    This function has only two variables, with x_1, x_2 ∈ [−10, 10]; the global minimum is f_min = −0.24, located at (−0.24, 0) and (0, 0.24).

  7. Michalewicz's function:

    $$f(x) = - \sum\limits_{i = 1}^{d} {\sin (x_{i}) \left[ \sin \left( \frac{ix_{i}^{2}}{\pi} \right) \right]^{2m}}$$

    where m = 10 and x_i ∊ [0, π]. When d = 2, the global minimum is f_min ≈ −1.8013 at (2.20, 1.57).

  8. Rosenbrock's function:

    $$f(x) = \sum\limits_{i = 1}^{d - 1} {\left[ (1 - x_{i})^{2} + 100(x_{i + 1} - x_{i}^{2})^{2} \right]}$$

    where the global minimum is f_min = 0 at (1, …, 1).
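
For convenience, here is a compact Python transcription of the eight benchmark functions above, with a quick check against the stated optima. The function names are ours; the formulas follow the definitions given in this appendix.

```python
import math

def griewank(x):
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i)) for i, v in enumerate(x, 1))
    return s - p + 1.0

def sphere(x):
    return sum(v * v for v in x)

def rastrigin(x, A=10.0):
    return A * len(x) + sum(v * v - A * math.cos(2 * math.pi * v) for v in x)

def schaffer_f6(x, y):
    sq = x * x + y * y
    return 0.5 + (math.sin(math.sqrt(sq)) ** 2 - 0.5) / (1 + 0.001 * sq) ** 2

def moved_axis_hyperellipsoid(x):
    return sum(5.0 * i * v * v for i, v in enumerate(x, 1))

def bohachevsky3(x1, x2):
    return (x1 * x1 + 2 * x2 * x2
            - 0.3 * math.cos(3 * math.pi * x1)
            + 0.3 * math.cos(4 * math.pi * x2) + 0.3)

def michalewicz(x, m=10):
    return -sum(math.sin(v) * math.sin(i * v * v / math.pi) ** (2 * m)
                for i, v in enumerate(x, 1))

def rosenbrock(x):
    return sum((1 - x[i]) ** 2 + 100 * (x[i + 1] - x[i] ** 2) ** 2
               for i in range(len(x) - 1))

# Quick check: the minima with known value 0 at the origin (or at all-ones).
zeros = [0.0] * 5
assert griewank(zeros) == 0 and sphere(zeros) == 0 and rastrigin(zeros) == 0
assert moved_axis_hyperellipsoid(zeros) == 0 and rosenbrock([1.0] * 5) == 0
print(michalewicz([2.20, 1.57]))  # approximately -1.80 for d = 2
```

These transcriptions could serve as the fitness argument f in the wolf_search sketch given earlier, e.g., wolf_search(rastrigin, dim=10, bounds=(-5.12, 5.12)).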


About this article

Cite this article

Fong, S., Deb, S. & Yang, XS. A heuristic optimization method inspired by wolf preying behavior. Neural Comput & Applic 26, 1725–1738 (2015). https://doi.org/10.1007/s00521-015-1836-9

