
A Joint Relation Extraction Model Based on Domain N-Gram Adapter and Axial Attention for Military Domain

  • Conference paper
Web Information Systems and Applications (WISA 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14094)


Abstract

Domain-specific relation extraction plays an important role in constructing domain knowledge graphs and in downstream analysis. In military intelligence relation extraction, challenges such as overlapping relations and exposure bias arise. This paper therefore presents a single-step joint relation extraction model for military text analysis that combines a domain N-gram adapter with axial attention. To account for domain-specific language structures and patterns, the domain N-gram adapter is incorporated into the pre-trained language model to improve the model's encoding. The axial attention mechanism is then applied to capture the dependencies between token pairs and their contexts, further enhancing the model's representation ability. Finally, entities and relations are jointly extracted by a relation-specific decoding method. Experiments demonstrate the effectiveness of the proposed model, which achieves an F1-score of 0.6690 on a military relation extraction dataset and 0.6051 on CMeIE, outperforming existing joint relation extraction models.
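To make the encoder concrete, the sketch below shows one way the two components the abstract describes could be realized: an N-gram adapter that gates embeddings of matched domain n-grams into the token representations, and axial attention that refines the L × L token-pair table row by row and column by column. This is a minimal illustration assuming PyTorch; the class names, tensor shapes, and the gated-fusion design are assumptions for exposition, not the authors' implementation.

```python
# Minimal sketch of the two encoding components, assuming PyTorch.
# All names, shapes, and the gated-fusion design are illustrative
# assumptions, not the paper's published implementation.
import torch
import torch.nn as nn


class NGramAdapter(nn.Module):
    """Fuses embeddings of matched domain n-grams into token states
    (in the spirit of the T-DNA-style adapters the paper builds on)."""

    def __init__(self, ngram_vocab: int, hidden: int):
        super().__init__()
        self.ngram_emb = nn.Embedding(ngram_vocab, hidden, padding_idx=0)
        self.gate = nn.Linear(2 * hidden, hidden)

    def forward(self, token_states, ngram_ids, alignment):
        # token_states: (B, L, H) encoder output
        # ngram_ids:    (B, N)    ids of domain n-grams found in the text
        # alignment:    (B, L, N) 1 where token i lies inside n-gram k
        ngram_states = self.ngram_emb(ngram_ids)           # (B, N, H)
        pooled = alignment.float() @ ngram_states          # (B, L, H)
        g = torch.sigmoid(self.gate(torch.cat([token_states, pooled], dim=-1)))
        return token_states + g * pooled                   # gated residual fusion


class AxialAttention(nn.Module):
    """Row-wise then column-wise self-attention over the L x L token-pair
    table, so cell (i, j) attends along row i and column j instead of
    over all L^2 cells at once."""

    def __init__(self, hidden: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)

    def forward(self, table):
        # table: (B, L, L, H) pairwise token-pair representations
        b, l, _, h = table.shape
        x = table.reshape(b * l, l, h)                     # rows as sequences
        x, _ = self.row_attn(x, x, x)
        x = x.reshape(b, l, l, h).transpose(1, 2).reshape(b * l, l, h)
        x, _ = self.col_attn(x, x, x)                      # columns as sequences
        return x.reshape(b, l, l, h).transpose(1, 2)       # back to (B, L, L, H)


if __name__ == "__main__":
    B, L, H = 2, 8, 64
    fused = NGramAdapter(ngram_vocab=1000, hidden=H)(
        torch.randn(B, L, H),
        torch.randint(1, 1000, (B, 5)),
        torch.randint(0, 2, (B, L, 5)),
    )
    refined = AxialAttention(H)(torch.randn(B, L, L, H))
    print(fused.shape, refined.shape)  # (2, 8, 64) and (2, 8, 8, 64)
```

In the single-step decoding the abstract refers to (in the style of OneRel or GPLinker), the refined table would then be scored per relation tag and all triples read off in one pass; the exact tagging scheme would follow the paper itself.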


References

  1. Zheng, S., Wang, F., Bao, H., et al.: Joint extraction of entities and relations based on a novel tagging scheme. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, vol. 1 (2017)

  2. Li, Q., Ji, H.: Incremental joint extraction of entity mentions and relations. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, vol. 1, pp. 402–412 (2014)

  3. Wang, Y., Yu, B., Zhang, Y., et al.: TPLinker: single-stage joint extraction of entities and relations through token pair linking. In: Proceedings of the 28th International Conference on Computational Linguistics, pp. 1572–1582 (2020)

  4. Zhou, G., Su, J., Zhang, J., Zhang, M.: Exploring various knowledge in relation extraction. In: Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics, pp. 427–434 (2005)

  5. Chan, Y., Roth, D.: Exploiting syntactico-semantic structures for relation extraction. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, pp. 551–560 (2011)

  6. Zhong, Z., Chen, D.: A frustratingly easy approach for entity and relation extraction. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 50–61 (2021)

  7. Miwa, M., Bansal, M.: End-to-end relation extraction using LSTMs on sequences and tree structures. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, vol. 1 (2016)

  8. Wei, Z., Su, J., Wang, Y., et al.: A novel cascade binary tagging framework for relational triple extraction. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 1476–1488 (2020)

  9. Zhang, J., Liu, M., Xu, L.: Multiple-granularity graph for document-level relation extraction. In: Zhao, X., Yang, S., Wang, X., Li, J. (eds.) WISA 2022. LNCS, vol. 13579, pp. 126–134. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-20309-1_11

  10. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL-HLT, vol. 1, pp. 4171–4186 (2019)

  11. Brown, T., Mann, B., Ryder, N., et al.: Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 33, 1877–1901 (2020)

  12. Beltagy, I., Lo, K., Cohan, A.: SciBERT: a pretrained language model for scientific text. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, pp. 3615–3620 (2019)

  13. Diao, S., Xu, R., Su, H., et al.: Taming pre-trained language models with n-gram representations for low-resource domain adaptation. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, vol. 1, pp. 3336–3349 (2021)

  14. Liu, Y., Ott, M., Goyal, N., et al.: RoBERTa: a robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692 (2019)

  15. Shang, Y., Huang, H., Mao, X.: OneRel: joint entity and relation extraction with one module in one step. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 11285–11293 (2022)

  16. Wang, H., Zhu, Y., Green, B., Adam, H., Yuille, A., Chen, L.-C.: Axial-DeepLab: stand-alone axial-attention for panoptic segmentation. In: Vedaldi, A., Bischof, H., Brox, T., Frahm, J.-M. (eds.) ECCV 2020, Part IV. LNCS, vol. 12349, pp. 108–126. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58548-8_7

  17. Shaw, P., Uszkoreit, J., Vaswani, A.: Self-attention with relative position representations. In: Proceedings of NAACL-HLT, pp. 464–468 (2018)

  18. Su, J.: GPLinker: joint entity relation extraction based on GlobalPointer (2022). https://kexue.fm/archives/8888

  19. Guan, T., Zan, H., Zhou, X., Xu, H., Zhang, K.: CMeIE: construction and evaluation of Chinese medical information extraction dataset. In: Zhu, X., Zhang, M., Hong, Y., He, R. (eds.) NLPCC 2020, Part I. LNCS (LNAI), vol. 12430, pp. 270–282. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-60450-9_22


Author information

Corresponding author

Correspondence to Ziqing Xu.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Yang, Z., Li, Z., Xu, Z., Gan, Z., Cao, W. (2023). A Joint Relation Extraction Model Based on Domain N-Gram Adapter and Axial Attention for Military Domain. In: Yuan, L., Yang, S., Li, R., Kanoulas, E., Zhao, X. (eds) Web Information Systems and Applications. WISA 2023. Lecture Notes in Computer Science, vol 14094. Springer, Singapore. https://doi.org/10.1007/978-981-99-6222-8_20


  • DOI: https://doi.org/10.1007/978-981-99-6222-8_20

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-6221-1

  • Online ISBN: 978-981-99-6222-8

  • eBook Packages: Computer Science, Computer Science (R0)
