
Non-revisiting genetic algorithm with adaptive mutation using constant memory

  • Regular Research Paper
  • Published in: Memetic Computing

Abstract

The continuous non-revisiting genetic algorithm (cNrGA) uses the entire search history, together with a parameter-less adaptive mutation, to significantly enhance search performance. Storing the entire search history is natural and costs little when the number of fitness evaluations is small or moderate. However, when a substantial number of evaluations is required, some memory management is desirable. In this paper, we propose two pruning mechanisms that keep the memory usage constant: least-recently-used (LRU) pruning and random pruning. The basic idea is to prune a unit of memory whenever the memory threshold is reached and new search information needs to be stored, so that the overall memory used stays constant. At the same time, both pruning strategies naturally form parameter-less adaptive mutation operators. A study is carried out to evaluate the impact on performance caused by the loss of search history information. Experimental results show that (1) both strategies maintain the performance of cNrGA up to the empirical limit at which 90% of the search history is not recorded, and (2) cNrGA and its constant-memory variants outperform the real-coded genetic algorithm and standard particle swarm optimization. By pre-extracting all currently prunable history information and storing it in a list, namely the to-prune-list, the overhead of both pruning strategies becomes small. This suggests that cNrGA can be extended to situations in which the number of fitness evaluations is much larger than before, with no significant effect on statistical performance. This widens the applicability of cNrGA to more practical problems that require a larger number of fitness evaluations before converging to the global optimum.
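The constant-memory archive with the two pruning policies described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration only, not the authors' implementation: cNrGA actually stores its search history in a binary space partitioning tree and derives its adaptive mutation from that structure, whereas this sketch shows only the LRU and random eviction policies on a flat archive; all class and method names here are invented for illustration.

```python
import random
from collections import OrderedDict

class ConstantMemoryArchive:
    """Hypothetical sketch of a bounded search-history archive with
    LRU or random pruning, loosely following the paper's idea of
    evicting one stored entry whenever the memory threshold is hit."""

    def __init__(self, capacity, policy="lru"):
        assert policy in ("lru", "random")
        self.capacity = capacity
        self.policy = policy
        # point -> fitness, kept in recency order for the LRU policy
        self.entries = OrderedDict()

    def seen(self, point):
        """Return True if the point was already evaluated (a revisit)."""
        if point in self.entries:
            self.entries.move_to_end(point)  # touching a point refreshes recency
            return True
        return False

    def record(self, point, fitness):
        """Store a newly evaluated point, pruning one entry if full."""
        if point in self.entries:
            self.entries.move_to_end(point)
            return
        if len(self.entries) >= self.capacity:
            self._prune()
        self.entries[point] = fitness

    def _prune(self):
        if self.policy == "lru":
            # Evict the least recently used entry in O(1).
            self.entries.popitem(last=False)
        else:
            # Random pruning; a real implementation would keep a
            # pre-extracted to-prune-list to avoid copying the keys.
            victim = random.choice(list(self.entries))
            del self.entries[victim]
```

Under this sketch, the to-prune-list optimization mentioned in the abstract amounts to maintaining the set of prunable entries incrementally, so each eviction is a constant-time pop rather than a scan of the whole history.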


Figs. 1–8 (figures not included in this preview)



Acknowledgments

The work described in this paper was supported by a grant from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. CityU 125313). We thank Dr. Chi Kin Chow for suggesting that pruning can be done randomly on the discrete version of NrGA.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Shiu Yin Yuen.

Appendix

See Tables 5, 6, 7 and 8.

Table 5 Performance comparison of cNrGA, cNrGA/CM/LRU and cNrGA/CM/R. The best mean fitness values over three algorithms are shaded in grey. \(\dag \) means cNrGA is significantly superior to its variant with constant memory, while \(\ddag \) means cNrGA is significantly inferior
Table 6 Performance comparison of cNrGA, cNrGA/CM/LRU and cNrGA/CM/R. The best mean fitness values over three algorithms are shaded in grey. \(\dag \) means cNrGA is significantly superior to its variant with constant memory, while \(\ddag \) means cNrGA is significantly inferior
Table 7 Performance comparison of cNrGA, cNrGA/CM/LRU and cNrGA/CM/R. The best mean fitness values over three algorithms are shaded in grey. \(\dag \) means cNrGA is significantly superior to its variant with constant memory, while \(\ddag \) means cNrGA is significantly inferior
Table 8 Performance comparison of cNrGA, cNrGA/CM/LRU and cNrGA/CM/R. The best mean fitness values over three algorithms are shaded in grey. \(\dag \) means cNrGA is significantly superior to its variant with constant memory, while \(\ddag \) means cNrGA is significantly inferior
Table 9 Mean results (mean) and standard deviations (SD) of cNrGA/CM/LRU, cNrGA/CM/R, real-coded GA and SPSO 2011
Table 10 Significance tests of the three variants of cNrGA vs. real-coded GA and SPSO 2011 [\(+\) means cNrGA (or its variant) is significantly superior, while \(-\) means significantly inferior]


About this article


Cite this article

Lou, Y., Yuen, S.Y. Non-revisiting genetic algorithm with adaptive mutation using constant memory. Memetic Comp. 8, 189–210 (2016). https://doi.org/10.1007/s12293-015-0178-6
