
Less is More: A Prototypical Framework for Efficient Few-Shot Named Entity Recognition

Conference paper, published in Natural Language Processing and Information Systems (NLDB 2023).

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13913).

Abstract

Few-shot named entity recognition (NER) aims to leverage a small number of labeled examples to extract novel-class named entities from unstructured text. Although existing few-shot NER methods, such as ESD and DecomposedMetaNER, are effective, they are quite complex and inefficient, which makes them unsuitable for real-world applications where prediction time is a critical factor. In this paper, we propose a simple span-based prototypical framework that follows the metric-based meta-learning paradigm and does not require time-consuming fine-tuning. In addition, the BERT encoding process in our model can be pre-computed and cached, making the final inference process even faster. Experimental results show that, compared with state-of-the-art models, the proposed framework achieves comparable effectiveness with much better efficiency.
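The abstract describes a span-level prototypical classifier in the metric-based meta-learning setting: class prototypes are built from support-span representations, and query spans are labeled by their nearest prototype, with the encoder output cacheable ahead of time. The sketch below illustrates that general idea only; the function names, the mean-pooled prototypes, and the squared Euclidean distance are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of span-level prototypical classification (assumed details, not the
# paper's exact model). Span embeddings would, in practice, be pooled from cached BERT
# token encodings so that no encoder pass is needed at inference time.
import torch


def build_prototypes(support_embs: torch.Tensor, support_labels: torch.Tensor) -> torch.Tensor:
    """Average the support-span embeddings of each class into one prototype vector.

    support_embs:   (num_support_spans, dim) span representations.
    support_labels: (num_support_spans,) integer class ids in 0..C-1.
    Returns:        (C, dim) prototype matrix.
    """
    num_classes = int(support_labels.max().item()) + 1
    dim = support_embs.size(1)
    protos = torch.zeros(num_classes, dim)
    for c in range(num_classes):
        protos[c] = support_embs[support_labels == c].mean(dim=0)
    return protos


def classify_spans(query_embs: torch.Tensor, prototypes: torch.Tensor) -> torch.Tensor:
    """Assign each query span to its nearest prototype (squared Euclidean distance)."""
    dists = torch.cdist(query_embs, prototypes, p=2) ** 2  # (num_query_spans, C)
    return dists.argmin(dim=1)


if __name__ == "__main__":
    # Toy example: 2 classes, 3 support spans each, 4 query spans, 16-dim embeddings.
    torch.manual_seed(0)
    support = torch.randn(6, 16)
    labels = torch.tensor([0, 0, 0, 1, 1, 1])
    queries = torch.randn(4, 16)
    protos = build_prototypes(support, labels)
    print(classify_spans(queries, protos))  # predicted class id per query span
```

Because classification reduces to a distance computation against a handful of prototypes, pre-computing and caching the span embeddings removes the encoder from the inference path, which is the source of the efficiency gain claimed in the abstract.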


References

1. Ba, J., Kiros, J.R., Hinton, G.E.: Layer normalization. arXiv preprint arXiv:1607.06450 (2016)
2. Cui, L., Wu, Y., Liu, J., Yang, S., Zhang, Y.: Template-based named entity recognition using BART. In: ACL-IJCNLP (2021)
3. Das, S.S.S., Katiyar, A., Passonneau, R., Zhang, R.: CONTaiNER: few-shot named entity recognition via contrastive learning. In: ACL (2022)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL (2019)
5. Ding, N., et al.: Few-NERD: a few-shot named entity recognition dataset. In: ACL-IJCNLP (2021)
6. Friedrich, A., et al.: The SOFC-exp corpus and neural approaches to information extraction in the materials science domain. In: ACL (2020)
7. Fritzler, A., Logacheva, V., Kretov, M.: Few-shot classification in named entity recognition task. In: SAC (2019)
8. Fu, J., Huang, X., Liu, P.: SpanNER: named entity re-/recognition as span prediction. In: ACL-IJCNLP (2021)
9. Gao, T., Han, X., Liu, Z., Sun, M.: Hybrid attention-based prototypical networks for noisy few-shot relation classification. In: AAAI (2019)
10. Guo, J., et al.: Automated chemical reaction extraction from scientific literature. J. Chem. Inf. Model. 62(9), 2035–2045 (2021)
11. Hou, Y., et al.: Few-shot slot tagging with collapsed dependency transfer and label-enhanced task-adaptive projection network. In: ACL (2020)
12. Huang, J., et al.: Few-shot named entity recognition: an empirical baseline study. In: EMNLP (2021)
13. Kononova, O., et al.: Text-mined dataset of inorganic materials synthesis recipes. Sci. Data 6(1), 1–11 (2019)
14. Li, Y., Liu, L., Shi, S.: Empirical analysis of unlabeled entity problem in named entity recognition. In: ICLR (2021)
15. de Lichy, C., Glaude, H., Campbell, W.: Meta-learning for few-shot named entity recognition. In: 1st Workshop on Meta Learning and Its Applications to Natural Language Processing (2021)
16. Ma, R., Zhou, X., Gui, T., Tan, Y.C., Zhang, Q., Huang, X.: Template-free prompt tuning for few-shot NER. In: NAACL (2022)
17. Ma, T., Jiang, H., Wu, Q., Zhao, T., Lin, C.Y.: Decomposed meta-learning for few-shot named entity recognition. In: ACL (2022)
18. Ouchi, H., et al.: Instance-based learning of span representations: a case study through named entity recognition. In: ACL (2020)
19. Reimers, N., Gurevych, I.: The curse of dense low-dimensional information retrieval for large index sizes. In: ACL (2021)
20. Snell, J., Swersky, K., Zemel, R.S.: Prototypical networks for few-shot learning. In: NIPS (2017)
21. Sung, F., Yang, Y., Zhang, L., Xiang, T., Torr, P.H.S., Hospedales, T.M.: Learning to compare: relation network for few-shot learning. In: CVPR (2018)
22. Vinyals, O., Blundell, C., Lillicrap, T., Kavukcuoglu, K., Wierstra, D.: Matching networks for one shot learning. In: NIPS (2016)
23. Wang, P., et al.: An enhanced span-based decomposition method for few-shot sequence labeling. In: NAACL (2022)
24. Wang, X., Han, X., Huang, W., Dong, D., Scott, M.R.: Multi-similarity loss with general pair weighting for deep metric learning. In: CVPR (2019)
25. Wang, Y., Chu, H., Zhang, C., Gao, J.: Learning from language description: low-shot named entity recognition via decomposed framework. In: EMNLP (2021)
26. Weston, L., et al.: Named entity recognition and normalization applied to large-scale information extraction from the materials science literature. J. Chem. Inf. Model. 59(9), 3692–3702 (2019)
27. Yang, Y., Katiyar, A.: Simple and effective few-shot named entity recognition with structured nearest neighbor learning. In: EMNLP (2020)
28. Yu, J., Bohnet, B., Poesio, M.: Named entity recognition as dependency parsing. In: ACL (2020)
29. Zhang, Y., Wang, C., Soukaseum, M., Vlachos, D.G., Fang, H.: Unleashing the power of knowledge extraction from scientific literature in catalysis. J. Chem. Inf. Model. 62(14), 3316–3330 (2022)


Acknowledgements

We greatly thank all reviewers for their constructive comments. The research was supported as part of the Center for Plastics Innovation, an Energy Frontier Research Center, funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences (BES), under Award Number DE-SC0021166. The research was also supported in part through the use of the DARWIN computing system: DARWIN - A Resource for Computational and Data-intensive Research at the University of Delaware and in the Delaware Region, Rudolf Eigenmann, Benjamin E. Bagozzi, Arthi Jayaraman, William Totten, and Cathy H. Wu, University of Delaware, 2021, URL: https://udspace.udel.edu/handle/19716/29071.

Author information

Correspondence to Yue Zhang.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, Y., Fang, H. (2023). Less is More: A Prototypical Framework for Efficient Few-Shot Named Entity Recognition. In: Métais, E., Meziane, F., Sugumaran, V., Manning, W., Reiff-Marganiec, S. (eds) Natural Language Processing and Information Systems. NLDB 2023. Lecture Notes in Computer Science, vol 13913. Springer, Cham. https://doi.org/10.1007/978-3-031-35320-8_3


  • DOI: https://doi.org/10.1007/978-3-031-35320-8_3


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35319-2

  • Online ISBN: 978-3-031-35320-8

  • eBook Packages: Computer Science, Computer Science (R0)
