
Selective ensemble based on extreme learning machine and improved discrete artificial fish swarm algorithm for haze forecast

Published in: Applied Intelligence

Abstract

Urban haze pollution is becoming increasingly serious, and the World Health Organization (WHO) considers it highly harmful to human health. Haze forecasts can therefore help protect public health. In this paper, a Selective ENsemble based on an Extreme Learning Machine (ELM) and an Improved Discrete Artificial Fish Swarm algorithm (IDAFSEN) is proposed, which overcomes the drawback that a single ELM is unstable in its classification. First, an initial pool of base ELMs is generated by bootstrap sampling and pre-pruned by computing a pairwise diversity measure for each base ELM. Second, a subset of the pre-pruned base ELMs with higher precision and greater diversity is selected using an Improved Discrete Artificial Fish Swarm Algorithm (IDAFSA). Finally, the selected base ELMs are combined through majority voting. Experimental results on 16 datasets from the UCI Machine Learning Repository demonstrate that IDAFSEN achieves better classification accuracy than previously reported methods. After evaluating the performance of the proposed approach, the paper shows how it can be applied to haze forecasting in China to protect human health.
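The pipeline in the abstract — bootstrap-trained base ELMs, diversity-based pre-pruning, and majority voting — can be sketched as follows. This is a minimal illustration in NumPy, not the authors' IDAFSEN implementation: the IDAFSA selection step is replaced by a placeholder (the pre-pruned subset is used directly), and all class names, sizes, and the toy data are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Single-hidden-layer extreme learning machine: random hidden
    weights, output weights solved in closed form via pseudo-inverse."""
    def __init__(self, n_hidden=20):
        self.n_hidden = n_hidden

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        T = np.eye(y.max() + 1)[y]         # one-hot targets
        self.beta = np.linalg.pinv(H) @ T  # closed-form output weights
        return self

    def predict(self, X):
        return (self._hidden(X) @ self.beta).argmax(axis=1)

def disagreement(p1, p2):
    """Pairwise diversity: fraction of samples labeled differently."""
    return np.mean(p1 != p2)

# Toy two-class data (illustrative only).
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# 1) Initial pool of base ELMs via bootstrap sampling.
pool = []
for _ in range(15):
    idx = rng.integers(0, len(X), len(X))
    pool.append(ELM().fit(X[idx], y[idx]))
preds = [m.predict(X) for m in pool]

# 2) Pre-prune: keep members whose mean pairwise disagreement with the
#    rest of the pool is at least the pool median.
div = [np.mean([disagreement(preds[i], preds[j])
                for j in range(len(pool)) if j != i])
       for i in range(len(pool))]
keep = [i for i in range(len(pool)) if div[i] >= np.median(div)]

# 3) The IDAFSA search over subsets would go here; we simply keep the
#    pre-pruned subset in this sketch.
# 4) Majority voting over the selected base ELMs.
votes = np.stack([preds[i] for i in keep])
ensemble_pred = np.apply_along_axis(
    lambda c: np.bincount(c).argmax(), 0, votes)
accuracy = np.mean(ensemble_pred == y)
```

The closed-form training step is what makes ELMs cheap enough to build large bootstrap pools, which is why diversity-based pruning (rather than retraining) is the natural way to shrink the ensemble.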



Acknowledgement

This work was supported by the National Natural Science Foundation of China under Grants No. 91546108, 71271071, 71490725, and 71301041, the National Key Research and Development Plan under Grant No. 2016YFF0202604, and the State Scholarship Fund. The authors would like to thank the reviewers for their comments and suggestions.

Author information

Corresponding author

Correspondence to Xuhui Zhu.

About this article

Cite this article

Zhu, X., Ni, Z., Cheng, M. et al. Selective ensemble based on extreme learning machine and improved discrete artificial fish swarm algorithm for haze forecast. Appl Intell 48, 1757–1775 (2018). https://doi.org/10.1007/s10489-017-1027-8
