Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)
DOI:
https://doi.org/10.1609/aaai.v38i21.30502
Keywords:
DMKM: Linked Open Data Knowledge Graphs & KB Completion, NLP: Information Extraction, DMKM: Knowledge Acquisition From The Web, NLP: Large Language Models
Abstract
Sequence-to-sequence generative models have recently gained attention for the relation extraction task. By approaching this problem as an end-to-end task, they have surpassed encoder-only models. However, little research has investigated the effect of the output syntax on the training of these models, and few approaches have been proposed for extracting ready-to-load knowledge graphs that follow the RDF standard. In this paper, we observe that a set of triples can be linearized in many different ways, and we evaluate the combined effect of language model size and RDF syntax on the task of relation extraction from Wikipedia abstracts.
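To illustrate what linearizing the same set of triples in different RDF syntaxes means, here is a minimal sketch using the rdflib Python library. It is not taken from the paper; the entities and predicates are hypothetical. The same small graph is serialized in N-Triples, Turtle, and RDF/XML, each producing a different textual target that a seq-to-seq model could be trained to generate.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Hypothetical toy graph: these entities and predicates are
# illustrative, not drawn from the paper's dataset.
EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)
g.add((EX.Ada_Lovelace, RDF.type, EX.Person))
g.add((EX.Ada_Lovelace, EX.birthPlace, EX.London))
g.add((EX.Ada_Lovelace, EX.name, Literal("Ada Lovelace")))

# One graph, three linearizations: each serialization format yields
# a different output string for the same underlying triples.
for fmt in ("nt", "turtle", "xml"):
    print(f"--- {fmt} ---")
    print(g.serialize(format=fmt))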
Published
2024-03-24
How to Cite
Ringwald, C., Gandon, F., Faron, C., Michel, F., & Abi Akl, H. (2024). Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23631-23632. https://doi.org/10.1609/aaai.v38i21.30502
Section
AAAI Student Abstract and Poster Program