Abstract
Negative sampling plays an important role in knowledge graph embedding: high-quality negative samples push model training to its limit, whereas most negative triples generated by simple uniform sampling are low quality and lead to vanishing gradients during training. Generative Adversarial Networks (GANs) have been applied to negative sampling, but GAN-based methods are more complicated to train. To address these issues, we propose DCNS. DCNS maintains two caches of high-quality negative triples, sampling from and updating the caches during training. In addition, to generate harder negative samples that have a greater impact on training, DCNS adopts a mixing operation. Finally, we evaluate link prediction with DCNS on four standard datasets. Extensive experiments show that our method yields significant improvements across various KG embedding models and outperforms state-of-the-art negative sampling methods.
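The abstract describes two mechanisms: per-pattern caches of hard negatives that are sampled from and periodically refreshed, and a mixing operation that interpolates cached negatives to produce harder ones. The sketch below illustrates those two ideas only; it is not the paper's implementation. All names (`DoubleCacheSampler`, `transe_score`, the cache keying by `(h, r)` / `(r, t)`) are illustrative assumptions, and the TransE-style scorer stands in for whatever embedding model is being trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def transe_score(h, r, t):
    """TransE-style plausibility score (higher = more plausible)."""
    return -np.linalg.norm(h + r - t)

class DoubleCacheSampler:
    """Illustrative double-cache negative sampler.

    Keeps one cache of hard head-corruptions keyed by (r, t) and one
    cache of hard tail-corruptions keyed by (h, r). Each cache is
    lazily refreshed by merging it with fresh uniform candidates and
    keeping the top-k by model score; two cached negatives are then
    mixed (linearly interpolated) to form a harder virtual negative.
    """

    def __init__(self, ent_emb, rel_emb, cache_size=10, n_candidates=30, alpha=0.5):
        self.ent, self.rel = ent_emb, rel_emb
        self.k, self.m, self.alpha = cache_size, n_candidates, alpha
        self.head_cache = {}  # (r, t) -> entity ids for head corruption
        self.tail_cache = {}  # (h, r) -> entity ids for tail corruption

    def _refresh(self, cache, key, score_fn):
        # Merge current cache entries with fresh uniform candidates,
        # then keep the k highest-scoring (hardest) entity ids.
        old = cache.get(key, np.empty(0, dtype=int))
        cand = np.concatenate([old, rng.integers(0, len(self.ent), self.m)])
        scores = np.array([score_fn(e) for e in cand])
        cache[key] = cand[np.argsort(-scores)[: self.k]]
        return cache[key]

    def _mix(self, ids):
        # Mixing operation: interpolate two cached hard negatives.
        a, b = rng.choice(ids, size=2, replace=True)
        return self.alpha * self.ent[a] + (1 - self.alpha) * self.ent[b]

    def sample(self, h, r, t):
        """Return (mixed head-negative, mixed tail-negative) embeddings."""
        tails = self._refresh(
            self.tail_cache, (h, r),
            lambda e: transe_score(self.ent[h], self.rel[r], self.ent[e]))
        heads = self._refresh(
            self.head_cache, (r, t),
            lambda e: transe_score(self.ent[e], self.rel[r], self.ent[t]))
        return self._mix(heads), self._mix(tails)
```

In a training loop, the returned embeddings would replace the uniformly sampled corruptions in the ranking loss; the refresh-then-sample step corresponds to the cache update described in the abstract.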
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Zheng, H., Guan, D., Xu, S., Yuan, W. (2024). DCNS: A Double-Cache Negative Sampling Method for Improving Knowledge Graph Embedding. In: Song, X., Feng, R., Chen, Y., Li, J., Min, G. (eds) Web and Big Data. APWeb-WAIM 2023. Lecture Notes in Computer Science, vol 14334. Springer, Singapore. https://doi.org/10.1007/978-981-97-2421-5_29
Print ISBN: 978-981-97-2420-8
Online ISBN: 978-981-97-2421-5