Abstract
Optimization problems can become intractable as the search space grows. Heuristic optimization methods, also called metaheuristics, have therefore been developed to search very large spaces of candidate solutions; they are general algorithmic skeletons that can be modified and extended to suit a wide range of optimization problems. Researchers have proposed a collection of metaheuristics inspired by the movements of animals and insects (e.g., fireflies, cuckoos, bats and accelerated PSO), which offer efficient computation and easy implementation. This paper studies a relatively new bio-inspired heuristic optimization algorithm called the Wolf Search Algorithm (WSA), which imitates the way wolves search for food and survive by avoiding their enemies. The WSA is tested quantitatively with different parameter values and compared to other metaheuristic algorithms on a range of popular non-convex functions used as performance test problems for optimization algorithms, with superior results observed in most tests.
Acknowledgments
The authors are thankful for the financial support from the research grant “Adaptive OVFDT with Incremental Pruning and ROC Corrective Learning for Data Stream Mining,” Grant no. MYRG073(Y3-L2)-FST12-FCC, offered by the University of Macau, FST and RDAO.
Appendix: fitness functions
The following benchmark functions were used to evaluate the algorithms.
1. Griewank's function: this test function has a large number of regularly distributed local minima.

$$f(x) = \sum\limits_{i = 1}^{n} \frac{x_{i}^{2}}{4000} - \prod\limits_{i = 1}^{n} \cos\left( \frac{x_{i}}{\sqrt{i}} \right) + 1$$

The global minimum is $f_{\min} = 0$ at $(0, \ldots, 0)$, where $-600 < x_i < 600$.
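As an illustration, Griewank's function can be coded as a minimal Python sketch (the function name is ours, not from the paper):

```python
import math

def griewank(x):
    """Griewank's function: sum-of-squares term minus a cosine product.
    Global minimum f = 0 at (0, ..., 0), for -600 < x_i < 600."""
    s = sum(xi ** 2 for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1))
    return s - p + 1.0
```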
2. Sphere function:

$$f(x) = \sum\limits_{i = 1}^{d} x_{i}^{2}, \quad \text{where } -5.12 < x_i < 5.12$$

The global minimum is $f_{\min} = 0$ at $(0, \ldots, 0)$.
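A one-line Python sketch of this convex, unimodal benchmark (function name ours):

```python
def sphere(x):
    """Sphere function: sum of squared coordinates.
    Global minimum f = 0 at (0, ..., 0)."""
    return sum(xi ** 2 for xi in x)
```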
3. Rastrigin's function: this function is difficult because of its large search space and large number of local minima.

$$f(x) = An + \sum\limits_{i = 1}^{n} \left[ x_{i}^{2} - A\cos(2\pi x_{i}) \right]$$

where $A = 10$ and $x_i \in [-5.12, 5.12]$; the global minimum is $f_{\min} = 0$ at $(0, \ldots, 0)$.
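A minimal Python sketch of Rastrigin's function with $A$ left as a parameter defaulting to the paper's value of 10 (function name ours):

```python
import math

def rastrigin(x, A=10.0):
    """Rastrigin's function: highly multimodal due to the cosine term.
    Global minimum f = 0 at (0, ..., 0) for x_i in [-5.12, 5.12]."""
    n = len(x)
    return A * n + sum(xi ** 2 - A * math.cos(2 * math.pi * xi) for xi in x)
```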
4. Schaffer F6:

$$f(x) = 0.5 + \frac{\sin^{2}\sqrt{x^{2} + y^{2}} - 0.5}{\left(1 + 0.001(x^{2} + y^{2})\right)^{2}}$$

where $x, y \in [-10, 10]$ and the global minimum is $f_{\min} = 0$ at $(0, 0)$.
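The two-variable Schaffer F6 can be sketched directly from the formula (function name ours):

```python
import math

def schaffer_f6(x, y):
    """Schaffer F6: concentric rings of local minima around the origin.
    Global minimum f = 0 at (0, 0)."""
    r2 = x * x + y * y
    num = math.sin(math.sqrt(r2)) ** 2 - 0.5
    den = (1.0 + 0.001 * r2) ** 2
    return 0.5 + num / den
```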
5. Moved axis parallel hyper-ellipsoid function:

$$f(x) = \sum\limits_{i = 1}^{n} 5i \cdot x_{i}^{2}$$

where $x_i \in [-5.12, 5.12]$ and the global minimum is $f_{\min} = 0$ at $(0, \ldots, 0)$.
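A short Python sketch of this weighted sum of squares, where each coordinate is scaled by $5i$ (function name ours):

```python
def moved_axis_hyperellipsoid(x):
    """Moved axis parallel hyper-ellipsoid: sum of 5*i*x_i^2,
    with 1-based index i; minimum 0 at the origin (as stated in the appendix)."""
    return sum(5 * i * xi ** 2 for i, xi in enumerate(x, start=1))
```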
6. The third Bohachevsky function:

$$f(x) = x_{1}^{2} + 2x_{2}^{2} - 0.3\cos(3\pi x_{1}) + 0.3\cos(4\pi x_{2}) + 0.3$$

This function has only two variables, with $x_1, x_2 \in [-10, 10]$; the global minimum is $f_{\min} = -0.24$, located at $(-0.24, 0)$ and $(0, 0.24)$.
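A Python sketch of the variant exactly as written in the appendix, with separate cosine terms (function name ours; note this differs from the more common Bohachevsky 3 form with a single combined cosine):

```python
import math

def bohachevsky3(x1, x2):
    """Third Bohachevsky function, as given in the appendix:
    quadratic bowl plus two independent cosine ripples."""
    return (x1 ** 2 + 2 * x2 ** 2
            - 0.3 * math.cos(3 * math.pi * x1)
            + 0.3 * math.cos(4 * math.pi * x2)
            + 0.3)
```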
7. Michalewicz's function:

$$f(x) = -\sum\limits_{i = 1}^{d} \sin(x_{i}) \left[ \sin\left( \frac{i x_{i}^{2}}{\pi} \right) \right]^{2m}$$

where $m = 10$ and $x_i \in [0, \pi]$. For $d = 2$, the global minimum is $f_{\min} \approx -1.8013$ at approximately $(2.20, 1.57)$.
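A Python sketch with $m$ left as a parameter defaulting to the paper's value of 10 (function name ours):

```python
import math

def michalewicz(x, m=10):
    """Michalewicz's function: steep valleys whose number grows with m.
    Uses 1-based index i, matching the formula in the appendix."""
    return -sum(math.sin(xi) * math.sin((i * xi ** 2) / math.pi) ** (2 * m)
                for i, xi in enumerate(x, start=1))
```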
8. Rosenbrock function:

$$f(x) = \sum\limits_{i = 1}^{d - 1} \left[ (1 - x_{i})^{2} + 100(x_{i + 1} - x_{i}^{2})^{2} \right]$$

The global minimum is $f_{\min} = 0$ at $(1, \ldots, 1)$.
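Finally, a Python sketch of the Rosenbrock "banana" function, summing over consecutive coordinate pairs (function name ours):

```python
def rosenbrock(x):
    """Rosenbrock function: narrow curved valley leading to (1, ..., 1),
    where the global minimum f = 0 lies."""
    return sum((1 - x[i]) ** 2 + 100 * (x[i + 1] - x[i] ** 2) ** 2
               for i in range(len(x) - 1))
```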
Fong, S., Deb, S. & Yang, XS. A heuristic optimization method inspired by wolf preying behavior. Neural Comput & Applic 26, 1725–1738 (2015). https://doi.org/10.1007/s00521-015-1836-9