PMI-Align: Word Alignment With Point-Wise Mutual Information Without Requiring Parallel Training Data

Fatemeh Azadi, Heshaam Faili, Mohammad Javad Dousti


Abstract
Word alignment has many applications, including cross-lingual annotation projection, bilingual lexicon extraction, and the evaluation or analysis of translation outputs. Recent studies show that contextualized embeddings from pre-trained multilingual language models can yield high-quality word alignments without the need for parallel training data. In this work, we propose PMI-Align, which computes and uses the point-wise mutual information between source and target tokens to extract word alignments, instead of the cosine similarity or dot product that most recent approaches rely on. Our experiments show that PMI-Align outperforms rival methods on five out of six language pairs. Although our approach requires no parallel training data, we show that it can also benefit approaches that use parallel data to fine-tune pre-trained language models for word alignment. Our code and data are publicly available.
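To make the idea concrete, the following is a minimal sketch of how PMI-style alignment scores could be derived from contextualized token embeddings: pairwise similarities are normalized into a proxy joint distribution, PMI is computed against its marginals, and alignments are read off with a mutual-argmax heuristic. The normalization scheme and the extraction heuristic below are illustrative assumptions for exposition, not necessarily the exact formulation used in the paper.

```python
# Hypothetical sketch of PMI-based word alignment from contextualized embeddings.
# The softmax-style normalization and mutual-argmax extraction are assumptions,
# not the paper's exact method.
import numpy as np

def pmi_alignment_scores(src_emb: np.ndarray, tgt_emb: np.ndarray) -> np.ndarray:
    """src_emb: (m, d) source token embeddings; tgt_emb: (n, d) target token embeddings."""
    # Raw similarity matrix over all source/target token pairs.
    sim = src_emb @ tgt_emb.T                     # shape (m, n)
    # Normalize over all pairs to obtain a proxy joint distribution p(s, t).
    joint = np.exp(sim - sim.max())
    joint /= joint.sum()
    # Marginals p(s) and p(t) by summing out the other side.
    p_src = joint.sum(axis=1, keepdims=True)      # shape (m, 1)
    p_tgt = joint.sum(axis=0, keepdims=True)      # shape (1, n)
    # Point-wise mutual information: log p(s, t) / (p(s) p(t)).
    return np.log(joint / (p_src * p_tgt))

def extract_alignments(pmi: np.ndarray) -> set:
    """Keep pairs that are mutual argmaxes of the PMI matrix (intersection heuristic)."""
    fwd = {(i, int(pmi[i].argmax())) for i in range(pmi.shape[0])}
    bwd = {(int(pmi[:, j].argmax()), j) for j in range(pmi.shape[1])}
    return fwd & bwd

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(4, 8))   # 4 source tokens, 8-dim embeddings (toy data)
    tgt = rng.normal(size=(5, 8))   # 5 target tokens
    print(extract_alignments(pmi_alignment_scores(src, tgt)))
```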
Anthology ID: 2023.findings-acl.782
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 12366–12377
URL: https://aclanthology.org/2023.findings-acl.782
DOI: 10.18653/v1/2023.findings-acl.782
Cite (ACL): Fatemeh Azadi, Heshaam Faili, and Mohammad Javad Dousti. 2023. PMI-Align: Word Alignment With Point-Wise Mutual Information Without Requiring Parallel Training Data. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12366–12377, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): PMI-Align: Word Alignment With Point-Wise Mutual Information Without Requiring Parallel Training Data (Azadi et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.782.pdf