Attention Weights in Transformer NMT Fail Aligning Words Between Sequences but Largely Explain Model Predictions

Javier Ferrando, Marta R. Costa-jussà


Abstract
This work presents an extensive analysis of the Transformer architecture in the Neural Machine Translation (NMT) setting. Focusing on the encoder-decoder attention mechanism, we show that attention weights systematically produce alignment errors by attending mainly to uninformative tokens in the source sequence. However, we observe that NMT models assign attention to these tokens in order to regulate the relative contribution to the prediction of the two contexts: the source sequence and the prefix of the target sequence. We provide evidence of the influence of these misalignments on model behavior, demonstrating that the encoder-decoder attention mechanism is well suited as an interpretability method for NMT. Finally, based on our analysis, we propose methods that substantially reduce the word alignment error rate compared to alignments induced directly from attention weights.
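The "standard induced alignments from attention weights" the abstract refers to are typically obtained by linking each target token to the source token that receives the highest cross-attention weight. A minimal sketch of this baseline, using a hypothetical attention matrix (the values and shape are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical encoder-decoder attention matrix for one sentence pair:
# rows are target positions, columns are source positions, and each
# row sums to 1 (post-softmax weights).
attn = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.1, 0.7],
])

# Standard induced alignment: link each target token to the source
# token with the maximum attention weight in its row.
alignment = attn.argmax(axis=1)
print(alignment.tolist())  # [0, 1, 2]
```

In practice the matrix would come from a specific decoder layer (or an average over heads), a choice that itself affects alignment quality; the paper's finding is that these raw weights often concentrate on uninformative tokens, which is why argmax extraction aligns poorly.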
Anthology ID:
2021.findings-emnlp.39
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
434–443
URL:
https://aclanthology.org/2021.findings-emnlp.39
DOI:
10.18653/v1/2021.findings-emnlp.39
Cite (ACL):
Javier Ferrando and Marta R. Costa-jussà. 2021. Attention Weights in Transformer NMT Fail Aligning Words Between Sequences but Largely Explain Model Predictions. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 434–443, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Attention Weights in Transformer NMT Fail Aligning Words Between Sequences but Largely Explain Model Predictions (Ferrando & Costa-jussà, Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.39.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.39.mp4