Not all layers are equally as important: Every Layer Counts BERT

Lucas Georges Gabriel Charpentier, David Samuel


Anthology ID:
2023.conll-babylm.20
Volume:
Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning
Month:
December
Year:
2023
Address:
Singapore
Editors:
Alex Warstadt, Aaron Mueller, Leshem Choshen, Ethan Wilcox, Chengxu Zhuang, Juan Ciro, Rafael Mosquera, Bhargavi Paranjabe, Adina Williams, Tal Linzen, Ryan Cotterell
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
238–252
URL:
https://aclanthology.org/2023.conll-babylm.20
DOI:
10.18653/v1/2023.conll-babylm.20
Cite (ACL):
Lucas Georges Gabriel Charpentier and David Samuel. 2023. Not all layers are equally as important: Every Layer Counts BERT. In Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning, pages 238–252, Singapore. Association for Computational Linguistics.
Cite (Informal):
Not all layers are equally as important: Every Layer Counts BERT (Georges Gabriel Charpentier & Samuel, CoNLL 2023)
PDF:
https://aclanthology.org/2023.conll-babylm.20.pdf