Abstract
Domain adaptation aims to diminish the discrepancy between the source and target domains and to improve classification of target samples by exploiting well-labeled source-domain data. However, most existing methods concentrate on learning domain-invariant features for cross-domain tasks while ignoring the correlation and discriminative information between the two domains. If the features learned from the source and target domains are uncorrelated, the adaptability of domain adaptation methods degrades substantially. To address this deficiency, we propose a novel domain adaptation approach, referred to as Canonical Correlation Discriminative Learning (CCDL). By introducing a novel correlation representation, CCDL maximizes the correlation between the features learned from the two domains. Specifically, CCDL learns a latent feature representation that reduces the domain difference by jointly adapting the marginal and conditional distributions of the source and target domains, while simultaneously maximizing the inter-class distance and minimizing the intra-class scatter. Experiments show that CCDL outperforms several state-of-the-art methods on four visual benchmark databases.
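The correlation component of CCDL builds on classical canonical correlation analysis (Hotelling, 1936), which finds projection directions under which two views of the data are maximally correlated. As a reminder of that building block only (not the full CCDL objective), here is a minimal NumPy sketch; the function name and the ridge term are illustrative choices, not taken from the paper:

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-6):
    """Classical CCA: the singular values of Cxx^{-1/2} Cxy Cyy^{-1/2}
    are the canonical correlations between views X and Y.
    Returns them in descending order."""
    # Center both views.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Covariance blocks, with a small ridge for numerical stability.
    Cxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / (n - 1)
    # Whiten via Cholesky factors: Lx^{-1} Cxy Ly^{-T} has the same
    # singular values as Cxx^{-1/2} Cxy Cyy^{-1/2}.
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    s = np.linalg.svd(M, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

# Two views sharing one latent factor: the leading canonical
# correlation should be close to 1, the rest near 0.
rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 1))
X = np.hstack([Z, rng.normal(size=(200, 2))])
Y = np.hstack([-3.0 * Z, rng.normal(size=(200, 2))])
print(canonical_correlations(X, Y))
```

CCDL embeds this correlation-maximization idea into a joint objective with marginal/conditional distribution adaptation and class-scatter terms, rather than running plain CCA as above.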
Acknowledgements
This work was supported by the National Natural Science Foundation of China (Grant Nos. 61672357, 61732011), the Guangdong Basic and Applied Basic Research Foundation (2019A1515011493), the Natural Science Foundation of Shenzhen University (No. 2019046), and the Science Foundation of Shenzhen (Grant No. JCYJ20160422144110140).
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Wang, W., Lu, Y., Lai, Z. (2020). Canonical Correlation Discriminative Learning for Domain Adaptation. In: Bäck, T., et al. (eds.) Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science, vol. 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_39
DOI: https://doi.org/10.1007/978-3-030-58112-1_39
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-58111-4
Online ISBN: 978-3-030-58112-1