Fixed Input Parameterization for Efficient Prompting

Eunbi Choi, Yongrae Jo, Joel Jang, Joonwon Jang, Minjoon Seo


Abstract
Recent works have shown that attaching prompts to the input is effective at conditioning Language Models (LMs) to perform specific tasks. However, prompts are always included in the input text during inference, even when they are fixed, thus incurring substantial computational and memory overhead. Also, there is currently no straightforward method of utilizing prompts that are longer than the maximum input length of the LM without incurring additional costs during inference. We formally define the Fixed Input Parameterization (FIP) problem, which focuses on injecting a fixed prompt into the parameters of an LM as an efficient alternative to attaching fixed prompts to the input. We show that in scenarios with long fixed prompts, FIP can be up to 280 times more efficient in terms of total FLOPs than previous approaches. We further explore methodologies for FIP and show promising results in persona-dependent conversation, semantic parsing, and zero-shot learning with task instructions. Through these explorations, we show that FIP can be a promising direction for conditioning language models in scenarios with long and fixed prompts.
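To make the idea concrete, below is a minimal, hypothetical sketch of one way a fixed prompt could be internalized into an LM's parameters: distilling a prompt-conditioned "teacher" pass into a prompt-free "student" copy, so the prompt no longer needs to appear in the input at inference time. This is not necessarily the authors' exact recipe; the model name, the prompt string, the KL-distillation objective, and the hyperparameters are illustrative assumptions.

```python
# Sketch of fixed-prompt internalization via distillation (assumptions noted above).
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # assumption: any causal LM could stand in here
fixed_prompt = "You are a helpful, persona-consistent assistant. "  # the fixed prompt to internalize

tokenizer = AutoTokenizer.from_pretrained(model_name)
teacher = AutoModelForCausalLM.from_pretrained(model_name).eval()   # frozen, sees the prompt
student = AutoModelForCausalLM.from_pretrained(model_name)          # trained to behave as if prompted
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)

prompt_ids = tokenizer(fixed_prompt, return_tensors="pt").input_ids


def fip_step(user_text: str) -> float:
    """One distillation step: match the student's next-token distributions
    (computed without the prompt) to the teacher's (computed with the prompt)."""
    user_ids = tokenizer(user_text, return_tensors="pt").input_ids

    with torch.no_grad():
        # Teacher sees [fixed prompt ; user input]; keep only logits over the user span.
        teacher_logits = teacher(torch.cat([prompt_ids, user_ids], dim=1)).logits
        teacher_logits = teacher_logits[:, prompt_ids.size(1):, :]

    # Student sees the user input alone, so inference later needs no prompt tokens.
    student_logits = student(user_ids).logits

    loss = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

After such training, the student would be queried with only the user input, which is where the FLOP savings over prepending a long fixed prompt would come from.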
Anthology ID:
2023.findings-acl.533
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8428–8441
URL:
https://aclanthology.org/2023.findings-acl.533
DOI:
10.18653/v1/2023.findings-acl.533
Cite (ACL):
Eunbi Choi, Yongrae Jo, Joel Jang, Joonwon Jang, and Minjoon Seo. 2023. Fixed Input Parameterization for Efficient Prompting. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8428–8441, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Fixed Input Parameterization for Efficient Prompting (Choi et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.533.pdf