Verb Conjugation in Transformers Is Determined by Linear Encodings of Subject Number

Sophie Hao, Tal Linzen


Abstract
Deep architectures such as Transformers are sometimes criticized for having uninterpretable “black-box” representations. We use causal intervention analysis to show that, in fact, some linguistic features are represented in a linear, interpretable format. Specifically, we show that BERT’s ability to conjugate verbs relies on a linear encoding of subject number that can be manipulated with predictable effects on conjugation accuracy. This encoding is found in the subject position at the first layer and the verb position at the last layer, but distributed across positions at middle layers, particularly when there are multiple cues to subject number.
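The abstract's central claim — that subject number lives in a linear, manipulable direction of the hidden space — can be illustrated with a minimal sketch. The code below is a hypothetical toy, not the paper's method: it uses synthetic vectors in place of BERT hidden states, estimates the "number direction" as a difference of class means (a simple linear probe), and then intervenes by reflecting each vector across that direction, which predictably flips the encoded number.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64   # hidden size (BERT uses 768; kept small here)
n = 200  # number of synthetic "hidden states"

# Synthetic stand-in for hidden states: each vector carries a linear
# subject-number component along a fixed direction w_true.
w_true = rng.normal(size=d)
w_true /= np.linalg.norm(w_true)
labels = rng.integers(0, 2, size=n)      # 0 = singular, 1 = plural
signs = 2 * labels - 1                   # map {0, 1} -> {-1, +1}
states = rng.normal(size=(n, d)) + 2.0 * signs[:, None] * w_true

# Estimate the number direction as the difference of class means.
w_hat = states[labels == 1].mean(0) - states[labels == 0].mean(0)
w_hat /= np.linalg.norm(w_hat)

def predict(h):
    """Read out number from the sign of the projection onto w_hat."""
    return (h @ w_hat > 0).astype(int)

def intervene(h):
    """Flip the encoded number by reflecting h across the w_hat hyperplane."""
    return h - 2.0 * (h @ w_hat)[:, None] * w_hat

acc_before = (predict(states) == labels).mean()
acc_after = (predict(intervene(states)) == labels).mean()
```

After the intervention the probe's predictions flip almost everywhere, mirroring the abstract's claim that manipulating the linear encoding has predictable effects; in the paper the analogous effect is measured on BERT's verb-conjugation accuracy rather than on a probe.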
Anthology ID: 2023.findings-emnlp.300
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 4531–4539
URL: https://aclanthology.org/2023.findings-emnlp.300
DOI: 10.18653/v1/2023.findings-emnlp.300
Cite (ACL): Sophie Hao and Tal Linzen. 2023. Verb Conjugation in Transformers Is Determined by Linear Encodings of Subject Number. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 4531–4539, Singapore. Association for Computational Linguistics.
Cite (Informal): Verb Conjugation in Transformers Is Determined by Linear Encodings of Subject Number (Hao & Linzen, Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.300.pdf