Improving Distantly Supervised Relation Extraction by Natural Language Inference

Authors

  • Kang Zhou, Iowa State University
  • Qiao Qiao, Iowa State University
  • Yuepei Li, Iowa State University
  • Qi Li, Iowa State University

DOI:

https://doi.org/10.1609/aaai.v37i11.26644

Keywords:

SNLP: Information Extraction, SNLP: Text Mining

Abstract

To reduce human annotation for relation extraction (RE) tasks, distantly supervised approaches have been proposed, but they often suffer from low performance. In this work, we propose a novel DSRE-NLI framework, which considers both distant supervision from existing knowledge bases and indirect supervision from pretrained language models for other tasks. DSRE-NLI energizes an off-the-shelf natural language inference (NLI) engine with a semi-automatic relation verbalization (SARV) mechanism to provide indirect supervision and further consolidates the distant annotations to benefit multi-class RE models. The NLI-based indirect supervision requires only one relation verbalization template from humans, serving as a semantically general template for each relation, and the template set is then enriched with high-quality textual patterns automatically mined from the distantly annotated corpus. With two simple and effective data consolidation strategies, the quality of training data is substantially improved. Extensive experiments demonstrate that the proposed framework significantly improves the SOTA performance (by up to 7.73% F1) on distantly supervised RE benchmark datasets. Our code is available at https://github.com/kangISU/DSRE-NLI.
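The NLI-based labeling idea described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the relation names, templates, threshold, and the word-overlap stub `nli_entailment_score` are all assumptions; in the actual framework, an off-the-shelf NLI model scores whether a sentence entails a filled-in relation template.

```python
import string

def nli_entailment_score(premise: str, hypothesis: str) -> float:
    # Stub standing in for an off-the-shelf NLI model. Here we use a crude
    # word-overlap proxy so the sketch is self-contained and runnable.
    strip = str.maketrans("", "", string.punctuation)
    p = set(premise.lower().translate(strip).split())
    h = set(hypothesis.lower().translate(strip).split())
    return len(p & h) / max(len(h), 1)

# One human-written, semantically general template per relation; the SARV
# mechanism would enrich each set with mined textual patterns (examples
# below are hypothetical).
TEMPLATES = {
    "founded_by": ["{obj} founded {subj}.", "{subj} was founded by {obj}."],
    "ceo_of": ["{obj} is the CEO of {subj}."],
}

def predict_relation(sentence: str, subj: str, obj: str, threshold: float = 0.5):
    """Return the relation whose best template is most entailed by the
    sentence, or None if no template clears the threshold."""
    best, best_score = None, threshold
    for rel, templates in TEMPLATES.items():
        for t in templates:
            hypothesis = t.format(subj=subj, obj=obj)
            score = nli_entailment_score(sentence, hypothesis)
            if score > best_score:
                best, best_score = rel, score
    return best
```

A sentence like "Apple was founded by Steve Jobs." entails the filled-in founded_by template and so receives that label, while an unrelated sentence matches no template and yields None; in DSRE-NLI such NLI decisions are then consolidated with the distant annotations to produce cleaner training data.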

Published

2023-06-26

How to Cite

Zhou, K., Qiao, Q., Li, Y., & Li, Q. (2023). Improving Distantly Supervised Relation Extraction by Natural Language Inference. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 14047-14055. https://doi.org/10.1609/aaai.v37i11.26644

Section

AAAI Technical Track on Speech & Natural Language Processing