Missing multi-label learning with non-equilibrium based on two-level autoencoder

Published in Applied Intelligence

Abstract

In a multi-label learning framework, each instance may belong to multiple labels simultaneously. Classification accuracy can be improved significantly by exploiting correlations such as label correlations, feature correlations, or the correlations between features and labels. Few studies have examined how to combine feature and label correlations, and those that do mostly assume complete data sets. In practice, however, labels are often missing because of cost or technical limitations in the data acquisition process. The few label completion algorithms currently suited to missing multi-label learning ignore noise interference in the feature space. At the same time, the threshold of the discriminant function often affects the classification results, especially for labels near the threshold. All these factors make it difficult to handle missing labels using label correlations. We therefore propose a missing multi-label learning algorithm with non-equilibrium based on a two-level autoencoder. First, label density is introduced to enlarge the classification margin of the label space. Then, a supplementary label matrix is constructed from the missing label matrix with a non-equilibrium label completion method. Finally, to account for feature-space noise, a two-level kernel extreme learning machine autoencoder is constructed to exploit feature information and label correlations jointly. The effectiveness of the proposed algorithm is verified by extensive experiments on both missing-label and complete-label data sets, and a statistical hypothesis test validates our approach.
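The final step of the pipeline, a two-level kernel extreme learning machine (KELM) autoencoder, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the RBF kernel, the regularization constant `C`, and the choice of `K @ beta` as the learned representation are all assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and the rows of B.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kelm_autoencoder(X, C=1.0, gamma=1.0):
    # One KELM autoencoder level: solve (K + I/C) beta = X so that
    # K @ beta reconstructs X, and return that regularized
    # reconstruction as a denoised representation of X.
    K = rbf_kernel(X, X, gamma)
    beta = np.linalg.solve(K + np.eye(len(X)) / C, X)
    return K @ beta

# Two-level stacking: the first level's output feeds the second level.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))   # toy feature matrix, 50 instances
H1 = kelm_autoencoder(X)
H2 = kelm_autoencoder(H1)      # representation used by a downstream classifier
```

Stacking two levels applies the kernel smoothing twice, which is one plausible way to attenuate feature-space noise before the label-correlation stage; the paper's actual architecture may differ.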


Notes

  1. http://mulan.sourceforge.net/datasets-mlc.html

  2. http://palm.seu.edu.cn/zhangml/


Acknowledgements

This research is supported by Key Laboratory of Intelligent Computing & Signal Processing, Ministry of Education (Anhui University)(2020A003).

Author information

Corresponding author

Correspondence to Yusheng Cheng.



Cite this article

Cheng, Y., Song, F. & Qian, K. Missing multi-label learning with non-equilibrium based on two-level autoencoder. Appl Intell 51, 6997–7015 (2021). https://doi.org/10.1007/s10489-020-02140-1

