Abstract
Multi-label transfer learning, which aims to learn robust classifiers for a target domain by leveraging knowledge from a source domain, has received considerable attention recently. The core of such research is similarity measurement. Nevertheless, the existing similarity measurement functions for probability distributions are too simple to fully capture the similarity between distributions. To address this problem, we propose Multi-label Transfer Learning via Latent Graph Alignment (G-MLTL). G-MLTL uses subspace learning to make the feature distribution of the target domain consistent with that of the source domain. At the same time, G-MLTL decomposes the label matrix to ensure that data points sharing the same labels have identical latent semantic representations in the new reconstruction space. G-MLTL also directly uses latent graph alignment to guide the knowledge transfer process. Extensive experiments demonstrate that G-MLTL significantly outperforms existing multi-label transfer learning methods; in particular, when the number of labels exceeds four, its mean Average Precision is 2.1–10.5% higher than that of the baseline algorithms.
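The distribution similarity measurement discussed above is commonly instantiated in transfer learning as the Maximum Mean Discrepancy (MMD), a kernel two-sample statistic that compares source and target feature distributions. As an illustrative sketch only (not the paper's actual objective), the snippet below computes a biased MMD² estimate with an RBF kernel on synthetic data; the function name `mmd_rbf`, the bandwidth `gamma`, and the sample data are assumptions made for the example.

```python
import numpy as np

def mmd_rbf(Xs, Xt, gamma=1.0):
    """Biased squared Maximum Mean Discrepancy between source samples Xs
    and target samples Xt under an RBF kernel: a standard way to score
    how far apart two feature distributions are."""
    def rbf(A, B):
        # Pairwise squared Euclidean distances, then the RBF kernel.
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * sq)
    m, n = len(Xs), len(Xt)
    return (rbf(Xs, Xs).sum() / m**2
            + rbf(Xt, Xt).sum() / n**2
            - 2 * rbf(Xs, Xt).sum() / (m * n))

rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, (200, 5))
target_close = rng.normal(0.0, 1.0, (200, 5))  # same distribution as source
target_far = rng.normal(3.0, 1.0, (200, 5))    # shifted distribution

print(mmd_rbf(source, target_close))  # small: similar distributions
print(mmd_rbf(source, target_far))    # larger: dissimilar distributions
```

A transfer learner can minimize such a statistic over a learned projection so that projected source and target features become indistinguishable, which is the general idea behind MMD-based subspace alignment.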
Notes
Corel5k https://github.com/watersink/Corel5K.
ESPGame http://www.hunch.net/~jl/
IAPRTC12 https://www.imageclef.org/photodata
Acknowledgements
This work was supported by the National Natural Science Foundation of China (Grant No. 61941113), the Science and Technology on Information System Engineering Laboratory (No. 05202004), and the Nanjing Science and Technology Development Plan Project (201805036).
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This article belongs to the Topical Collection: Special Issue on Large Scale Graph Data Analytics
Guest Editors: Xuemin Lin, Lu Qin, Wenjie Zhang, and Ying Zhang
Cite this article
Sang, J., Wang, Y., Yuan, L. et al. Multi-label transfer learning via latent graph alignment. World Wide Web 25, 879–898 (2022). https://doi.org/10.1007/s11280-021-00928-w