
When semi-supervised learning meets ensemble learning

  • Research Article
  • Published in: Frontiers of Electrical and Electronic Engineering in China

Abstract

Semi-supervised learning and ensemble learning are two important machine learning paradigms. The former attempts to achieve strong generalization by exploiting unlabeled data; the latter attempts to achieve strong generalization by combining multiple learners. Although both paradigms have achieved great success during the past decade, they have been developed almost independently of each other. In this paper, we advocate that semi-supervised learning and ensemble learning are in fact beneficial to each other, and that stronger learning machines can be built by leveraging unlabeled data and classifier combination together.
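To make the combination concrete, here is a minimal sketch (not the paper's own algorithm) of one way the two paradigms can interact: an ensemble learner, a random forest, repeatedly pseudo-labels the unlabeled points it is most confident about and retrains on the enlarged labeled set. The function name, confidence threshold, and round count are illustrative assumptions.

```python
# Sketch: ensemble-based self-training, assuming scikit-learn is available.
# A random forest (ensemble learner) pseudo-labels confident unlabeled points
# and is refit on the growing labeled set; threshold/rounds are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def ensemble_self_train(X_lab, y_lab, X_unlab, rounds=5, threshold=0.9):
    X, y = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    for _ in range(rounds):
        if len(pool) == 0:
            break
        proba = clf.predict_proba(pool)
        confident = proba.max(axis=1) >= threshold  # high-confidence points
        if not confident.any():
            break
        # Adopt the ensemble's predictions as pseudo-labels for those points.
        X = np.vstack([X, pool[confident]])
        y = np.concatenate([y, clf.classes_[proba[confident].argmax(axis=1)]])
        pool = pool[~confident]
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    return clf

# Toy setting: 30 labeled examples, a large unlabeled pool, a held-out test set.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_lab, y_lab = X[:30], y[:30]
X_unlab = X[30:400]
X_test, y_test = X[400:], y[400:]

model = ensemble_self_train(X_lab, y_lab, X_unlab)
print(model.score(X_test, y_test))
```

The pseudo-labeling step is where unlabeled data helps the ensemble, and the ensemble's averaged vote is what makes the confidence estimates reliable enough to trust; disagreement-based methods such as tri-training refine this idea further.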



Corresponding author

Correspondence to Zhi-Hua Zhou.

Additional information

Zhi-Hua Zhou received the BSc, MSc and PhD degrees in computer science from Nanjing University, China, in 1996, 1998 and 2000, respectively, all with the highest honors. He joined the Department of Computer Science & Technology at Nanjing University as an assistant professor in 2001, and is currently a professor and director of the LAMDA group. His research interests include machine learning, data mining, and pattern recognition. In these areas he has published over 80 papers in leading international journals and conference proceedings. He is an Associate Editor-in-Chief of Chinese Science Bulletin, an Associate Editor of IEEE Transactions on Knowledge and Data Engineering and ACM Transactions on Intelligent Systems and Technology, and serves on the editorial boards of various other journals. He is the founding steering committee co-chair of ACML, a steering committee member of PAKDD and PRICAI, and has served as program chair, vice chair, or area chair of many conferences. He is the chair of the Machine Learning Society of the Chinese Association of Artificial Intelligence (CAAI), vice chair of the Artificial Intelligence & Pattern Recognition Society of the China Computer Federation (CCF), and chair of the IEEE Computer Society Nanjing Chapter.

About this article

Cite this article

Zhou, ZH. When semi-supervised learning meets ensemble learning. Front. Electr. Electron. Eng. China 6, 6–16 (2011). https://doi.org/10.1007/s11460-011-0126-2
