A Survey on Efficient Training of Transformers

Bohan Zhuang, Jing Liu, Zizheng Pan, Haoyu He, Yuetian Weng, Chunhua Shen

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Survey Track. Pages 6823-6831. https://doi.org/10.24963/ijcai.2023/764

Recent advances in Transformers have come with enormous demands on computing resources, highlighting the importance of developing efficient training techniques that make Transformer training faster, cheaper, and more accurate through the efficient use of computation and memory. This survey provides the first systematic overview of efficient training for Transformers, covering recent progress in both acceleration arithmetic and hardware, with a focus on the former. We analyze and compare methods that save computation and memory costs for intermediate tensors during training, together with techniques for hardware/algorithm co-design. We conclude with a discussion of challenges and promising areas for future research.
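
As one illustrative instance of the intermediate-tensor memory savings the survey discusses, the sketch below shows activation (gradient) checkpointing applied to a Transformer-style block with PyTorch's torch.utils.checkpoint. It is a minimal example for orientation only, not code from the paper, and the block dimensions and layer choices are arbitrary assumptions.

    import torch
    import torch.nn as nn
    from torch.utils.checkpoint import checkpoint

    class CheckpointedBlock(nn.Module):
        """Transformer-style block whose intermediate activations are
        recomputed during the backward pass instead of being stored."""
        def __init__(self, dim: int = 256, heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm1 = nn.LayerNorm(dim)
            self.mlp = nn.Sequential(
                nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
            )
            self.norm2 = nn.LayerNorm(dim)

        def _forward(self, x):
            attn_out, _ = self.attn(x, x, x, need_weights=False)
            x = self.norm1(x + attn_out)
            return self.norm2(x + self.mlp(x))

        def forward(self, x):
            # Trade compute for memory: activations produced inside
            # _forward are dropped after the forward pass and recomputed
            # when gradients are needed.
            return checkpoint(self._forward, x, use_reentrant=False)

    x = torch.randn(8, 128, 256, requires_grad=True)
    block = CheckpointedBlock()
    block(x).sum().backward()  # reruns the block's forward during backward

Checkpointing is only one family among the computation- and memory-saving methods the survey compares; it reduces peak activation memory roughly in proportion to the number of checkpointed blocks, at the cost of one extra forward recomputation per block.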
Keywords:
Survey: Machine Learning
Survey: Natural Language Processing
Survey: Computer Vision
Survey: Multidisciplinary Topics and Applications