Can LMs Store and Retrieve 1-to-N Relational Knowledge?

Haruki Nagasawa, Benjamin Heinzerling, Kazuma Kokuta, Kentaro Inui


Abstract
It has been suggested that pretrained language models can be viewed as knowledge bases. One prerequisite for using language models as knowledge bases is how accurately they can store and retrieve world knowledge. Prior work has shown that language models can memorize a large amount of 1-to-1 relational knowledge, such as "a country and its capital," with high accuracy. However, world knowledge includes not only 1-to-1 but also 1-to-N relational knowledge, such as "a parent and their children," and it is not yet clear how accurately language models can handle such relations. To investigate language models' abilities with respect to 1-to-N relational knowledge, we begin by designing the problem setting. Specifically, we characterize 1-to-N relational knowledge and define two essential skills: (i) memorizing multiple objects individually and (ii) retrieving all stored objects at once, without excess or omission. We then inspect LMs' ability to handle 1-to-N relational knowledge on controlled synthetic data. We find that LMs can memorize multiple objects with high accuracy, but that generalizing the retrieval ability (specifically, enumeration) is challenging.
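As a concrete illustration of the two skills defined in the abstract, the following minimal Python sketch (not the authors' code; the relation, names, and metric definitions are invented for illustration) shows one way to express 1-to-N relational knowledge and the two evaluation criteria: memorization checks each subject-object pair individually, while enumeration requires the predicted object set to match the gold set exactly.

from typing import Dict, Set

# Hypothetical 1-to-N relational knowledge: "parent -> children".
# Subjects and objects are invented for illustration.
knowledge: Dict[str, Set[str]] = {
    "Alice": {"Bob", "Carol"},
    "Dave": {"Erin"},
}

def memorization_accuracy(predictions: Dict[str, Set[str]]) -> float:
    """Skill (i): each gold object must be retrievable individually."""
    gold_pairs = [(s, o) for s, objs in knowledge.items() for o in objs]
    hits = sum(o in predictions.get(s, set()) for s, o in gold_pairs)
    return hits / len(gold_pairs)

def enumeration_accuracy(predictions: Dict[str, Set[str]]) -> float:
    """Skill (ii): the predicted set must match the gold set exactly,
    with no missing and no extra objects."""
    exact = sum(predictions.get(s, set()) == objs
                for s, objs in knowledge.items())
    return exact / len(knowledge)

if __name__ == "__main__":
    # "Dave" is enumerated with one extra (hallucinated) object.
    preds = {"Alice": {"Bob", "Carol"}, "Dave": {"Erin", "Frank"}}
    print(memorization_accuracy(preds))  # 1.0: every gold object retrieved
    print(enumeration_accuracy(preds))   # 0.5: one subject enumerated with excess

Under these (assumed) metrics, a model can score perfectly on memorization while failing enumeration, which mirrors the gap the paper reports between storing objects individually and retrieving them all at once.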
Anthology ID:
2023.acl-srw.22
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Vishakh Padmakumar, Gisela Vallejo, Yao Fu
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
130–138
URL:
https://aclanthology.org/2023.acl-srw.22
DOI:
10.18653/v1/2023.acl-srw.22
Cite (ACL):
Haruki Nagasawa, Benjamin Heinzerling, Kazuma Kokuta, and Kentaro Inui. 2023. Can LMs Store and Retrieve 1-to-N Relational Knowledge? In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 130–138, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Can LMs Store and Retrieve 1-to-N Relational Knowledge? (Nagasawa et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-srw.22.pdf