Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages

Shivanshu Gupta, Yoshitomo Matsubara, Ankit Chadha, Alessandro Moschitti


Abstract
While impressive performance has been achieved on the task of Answer Sentence Selection (AS2) for English, the same does not hold for languages that lack large labeled datasets. In this work, we propose Cross-Lingual Knowledge Distillation (CLKD) from a strong English AS2 teacher as a method to train AS2 models for low-resource languages without the need for labeled data in the target language. To evaluate our method, we introduce 1) Xtr-WikiQA, a translation-based WikiQA dataset for 9 additional languages, and 2) TyDi-AS2, a multilingual AS2 dataset with over 70K questions spanning 8 typologically diverse languages. We conduct extensive experiments on Xtr-WikiQA and TyDi-AS2 with multiple teachers, diverse monolingual and multilingual pretrained language models (PLMs) as students, and both monolingual and multilingual training. The results demonstrate that CLKD either outperforms or rivals both supervised fine-tuning with the same amount of labeled data and a combination of machine translation and the teacher model. Our method can potentially enable stronger AS2 models for low-resource languages, while TyDi-AS2 can serve as the largest multilingual AS2 dataset for further studies in the research community.
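To make the distillation idea in the abstract concrete, below is a minimal PyTorch-style sketch of a single cross-lingual distillation step, not the authors' implementation: a frozen English teacher scores question/candidate pairs, and a target-language student is trained to match those soft scores on parallel (e.g., machine-translated) inputs. The `teacher`, `student`, batch formats, and the temperature-scaled KL loss are illustrative assumptions (Hugging Face-style sequence classifiers are assumed), not details taken from the paper.

```python
import torch
import torch.nn.functional as F


def clkd_step(teacher, student, en_batch, tgt_batch, temperature=2.0):
    """One illustrative cross-lingual distillation step (assumed setup).

    teacher   -- frozen English AS2 model (sequence classifier with .logits)
    student   -- target-language model being trained
    en_batch  -- tokenized English question/candidate pairs
    tgt_batch -- tokenized parallel target-language pairs (same order)
    """
    with torch.no_grad():
        # Soft relevance scores from the English teacher.
        teacher_logits = teacher(**en_batch).logits / temperature

    # Student reads the parallel target-language inputs.
    student_logits = student(**tgt_batch).logits / temperature

    # KL divergence between teacher and student distributions,
    # scaled by T^2 as is standard in knowledge distillation.
    loss = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return loss
```

In this sketch the only supervision is the teacher's soft output, which is why no labeled data in the target language is required; only parallel (or machine-translated) question/candidate pairs are needed to align the two inputs.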
Anthology ID:
2023.findings-acl.885
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14078–14092
URL:
https://aclanthology.org/2023.findings-acl.885
DOI:
10.18653/v1/2023.findings-acl.885
Bibkey:
Cite (ACL):
Shivanshu Gupta, Yoshitomo Matsubara, Ankit Chadha, and Alessandro Moschitti. 2023. Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages. In Findings of the Association for Computational Linguistics: ACL 2023, pages 14078–14092, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages (Gupta et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.885.pdf
Video:
https://aclanthology.org/2023.findings-acl.885.mp4